Bibliometrics and the Research Excellence Framework (REF)

This leaflet summarises the broad approach to using bibliometrics in the REF, and the further work that is being undertaken to develop this approach. It is intended for all researchers, research managers and administrators who have an interest in understanding the Research Excellence Framework.

What is the REF?

The Research Excellence Framework (REF) is the new system being developed by the four UK higher education funding bodies to assess research quality once the 2008 Research Assessment Exercise has been completed. The REF will consist of a single, unified framework for the assessment of research across all subjects. It will make greater use of quantitative indicators in the assessment of research quality than the present system, while taking account of key differences between disciplines. Assessment will combine quantitative indicators, including bibliometric indicators wherever these are appropriate, with light-touch expert review. Which of these elements are employed, and the balance between them, will vary as appropriate to each broad subject area.

Following initial consultations, detailed proposals for the REF are currently being developed and a pilot of the new bibliometric indicator of research quality is under way. The sector will be consulted on detailed proposals for the REF in mid-2009.

The four UK higher education funding bodies, namely the Higher Education Funding Council for England (HEFCE), the Scottish Funding Council (SFC), the Higher Education Funding Council for Wales (HEFCW) and the Department for Employment and Learning (DEL) in Northern Ireland, are working in partnership to develop a common approach to assessing research quality through the REF.

What are bibliometrics?

Bibliometrics are ways of measuring patterns of authorship, publication, and the use of literature. Some of these measures can be used to produce proxy indicators of the impact or quality of published research:

- new research builds on previous research, and researchers acknowledge this by citing earlier papers
- the extent to which research is cited provides some indication of the impact or influence it has on subsequent research
- patterns of citation can be measured and used to produce proxy indicators of research quality.

Why bibliometrics?

Following consultations on the reform of the RAE there was agreement to make greater use of metrics, to reduce the burden of assessing research quality. We identified the use of bibliometrics as a key element in moving to a more metrics-based system, as bibliometric techniques have the potential to provide robust and usable indicators of research quality across a number of disciplines.

However, citation data should be used with caution when constructing indicators for research assessment. The indicators must be constructed using robust methods, they should be interpreted by experts who understand the limitations and patterns of citation behaviour in the discipline concerned, and they should be used alongside other indicators of research quality.

After the pilot stage has been completed, the SFC, HEFCW and DEL will decide whether to implement the REF, and whether to use it to inform future funding. HEFCE is currently committed to applying the REF as a unified assessment and funding framework in England, and plans to phase it in from 2010-11 before fully implementing it in 2013.

How we produce citation indicators

A paper is published and subsequently cited by other papers; for example, a paper receives 24 citations in a given period. This is its citation count, but by itself it tells us little about the impact or influence the paper has had in its field.

We calculate the average number of citations for all papers published worldwide in the same field, over the same period. This is the field norm. For our example, let's assume the worldwide average in this field is 16 citations per paper.

To understand the impact or influence of the paper in its field, we divide the paper's citation count by the field norm, also taking into account the year in which the paper was published and the type of paper (for example, whether it is a review paper). This gives us the normalised citation rate, which for our example would be 24 ÷ 16 = 1.5, that is, a citation rate of 1.5 times the field norm.

However, the normalised citation rate of an individual paper does not provide a robust indication of its impact or influence. We need to look across a larger body of work and summarise the normalised citation rates of all its papers to provide a meaningful indication. We can summarise the results for substantial groups of papers in the form of a citation profile. This shows how much of that body of work was cited at different rates relative to worldwide norms.

An example citation profile

[Chart: a fictional example of citation data presented as a citation profile for a substantial group of papers published over several years (for example, a set of papers in a particular discipline produced by a particular institution). The chart shows the percentage of papers in each citation category: uncited, far below world average, below world average, around world average, above world average, and far above world average.]
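To make the calculation concrete, the following is a minimal Python sketch of the two steps described above: dividing a paper's citation count by its field norm, and summarising a group of normalised rates as a citation profile. The function names and, in particular, the category boundaries are illustrative assumptions for this sketch only; they are not the parameters or thresholds used in the REF pilot.

```python
# Illustrative sketch of the calculation described in the text.
# Category boundaries are made-up examples, not REF pilot parameters.

def normalised_citation_rate(citations, field_norm):
    """Citation count divided by the worldwide average (field norm)
    for papers of the same field, year and type."""
    return citations / field_norm

def citation_profile(rates):
    """Summarise a body of work as the percentage of papers falling in
    each citation category, relative to the worldwide norm of 1.0."""
    # Hypothetical category boundaries, for illustration only.
    categories = [
        ("Uncited",                 lambda r: r == 0),
        ("Far below world average", lambda r: 0 < r < 0.5),
        ("Below world average",     lambda r: 0.5 <= r < 1.0),
        ("Around world average",    lambda r: 1.0 <= r < 2.0),
        ("Above world average",     lambda r: 2.0 <= r < 4.0),
        ("Far above world average", lambda r: r >= 4.0),
    ]
    if not rates:
        return {}
    counts = {name: 0 for name, _ in categories}
    for r in rates:
        for name, in_category in categories:
            if in_category(r):
                counts[name] += 1
                break
    return {name: 100.0 * n / len(rates) for name, n in counts.items()}

# Worked example from the text: 24 citations against a field norm of 16.
print(normalised_citation_rate(24, 16))   # 1.5 times the field norm

# A fictional group of papers summarised as a citation profile.
print(citation_profile([0.0, 0.3, 0.8, 1.2, 1.5, 2.5, 4.8, 0.0, 1.1, 0.9]))
```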

This approach has been informed by expert advice and consultation with the sector. It is designed to take account of the very different citation patterns in different disciplines and sub-disciplinary fields, and to provide international benchmarking for UK research.

Although this approach uses citations per paper as the basic building block for constructing indicators, it is not intended, and is not appropriate, for assessing either individual papers or individual researchers. The data are not sufficiently informative at the individual level and need to be summarised for larger bodies of work to produce meaningful indicators. There are a number of other ways of using citation data to produce indicators, for example journal impact factors, but we will not use such indicators for the REF.

Which research outputs and subjects are covered?

There are currently two main databases which capture publication and citation information across a broad range of disciplines: Thomson Reuters' Web of Science and Elsevier's Scopus. These databases are used widely by academics, researchers and information professionals for a range of purposes. They focus mainly on journals (particularly those of international interest) and some other materials, for example conference proceedings. Because the types of media used for publishing research vary greatly between disciplines, the two citation databases cover very different proportions of the research published in different disciplines. In broad terms, their current coverage is:

- a large majority of research outputs in the biological, physical and medical sciences and in psychology
- generally moderate coverage in the health sciences, mathematics, engineering, computer science, economics and geography
- generally limited coverage in other social sciences, the humanities and the arts.

How will we use citation indicators in the REF?

The REF will produce an overall quality profile for each subject at each institution that is assessed. Quality profiles in each subject will be informed by an appropriate combination of citation indicators, light-touch peer review, and other metrics and information. The use of each of these elements, and the balance between them for each subject, will be determined after the current development work and further consultation with the sector. We expect that this will result in different elements being employed in different subjects:

- citation indicators will be used in those subjects where they are found to be robust and meaningful, alongside other indicators and information as appropriate
- for subjects where bibliometrics and other quantitative indicators are partially informative, they would be used in combination with qualitative elements and possibly some expert review of outputs
- for subjects where bibliometric indicators are not yet sufficiently mature to be informative, expert review of outputs will be used in combination with other applicable indicators and qualitative information.

Assessment in each subject will be overseen by expert panels. The panels will advise on which indicators are appropriate to the characteristics and diversity of research in their subject field, and on the weighting between the indicators. They will interpret and combine these indicators, together with peer review judgements as appropriate, to produce an overall quality profile.

Pilot of the bibliometrics indicator

We have identified a broad approach to constructing citation indicators for the REF, but a number of detailed issues need to be developed further. We are currently running a pilot with 22 higher education institutions to test and refine these issues. Evidence Ltd has been commissioned to work with HEFCE to run the pilot.

We will work with the pilot institutions to compile data about relevant staff and publications, and through this process identify the implications for institutions' management of research information. The pilot covers all disciplines in which there is at least moderate coverage of citation data, to help inform decisions about which disciplines should be included in future and to explore boundary issues. The pilot covers publications in the period 1 January 2001 to 31 December 2007, although decisions about the timeframe for future bibliometric exercises will be taken after further analysis and consultation with the sector.

We will work with Evidence Ltd to analyse citations to these papers using the Web of Science, and to some extent Scopus, and to produce indicators using a range of different parameters and methods. We will seek advice from subject expert groups on how to interpret the pilot outcomes and help form recommendations about using citation indicators in the REF.

Institutions taking part in the pilot:

- Bangor University
- University of Bath
- University of Birmingham
- Bournemouth University
- University of Cambridge
- University of Durham
- University of East Anglia
- University of Glasgow
- Imperial College London
- Institute of Cancer Research
- University of Leeds
- London School of Hygiene and Tropical Medicine
- University of Nottingham
- University of Plymouth
- University of Portsmouth
- Queen's University Belfast
- Robert Gordon University
- Royal Veterinary College
- University of Southampton
- University of Stirling
- University of Sussex
- University College London

The next consultation

Following the pilot, in mid-2009 we will put forward detailed proposals for consultation on the following issues:

- which disciplines citation indicators should be used for in the REF
- which staff and publications should be included in the citation analysis
- how we will choose which citation database(s) to use (including considerations of coverage and data quality)
- the process for collecting data about publications and researchers, and the implications for institutions' management of these data
- details of the methods for analysing citations (for example, how to define field norms, and how to handle issues such as self-citation)
- means of constructing the citation profile
- how the indicator can best be interpreted and used by expert panels alongside other information in the REF.

Alongside the bibliometrics pilot, we are developing proposals for other aspects of the REF, including the choice and use of other indicators, the approach to light-touch peer review and the subject structure for the REF. Our consultation in 2009 will also cover these.

Further information

Further information about bibliometrics and the REF can be found on the HEFCE website, www.hefce.ac.uk, under Research/Research Excellence Framework. Key documents available on the site include:

- 'Scoping study on the use of bibliometric analysis to measure the quality of research in UK higher education institutions', Center for Science and Technology Studies, University of Leiden
- 'Consultation on the assessment and funding of higher education research post-2008' (HEFCE 2007/34)
- consultation outcomes (Circular letter 13/2008).

To receive updates on the REF by email, join the REF-NEWS mailing list by going to the above web page and following the link. Enquiries should be emailed to ref@hefce.ac.uk