Complementary bibliometric analysis of the Educational Science (UV) research specialisation


April 28th, 2014

Complementary bibliometric analysis of the Educational Science (UV) research specialisation

Per Nyström, librarian, Mälardalen University Library, per.nystrom@mdh.se, +46 (0)21 101 637
Viktor Öman, librarian, Mälardalen University Library, viktor.oman@mdh.se, +46 (0)16 153 727

Contents

Introduction and general conditions
Descriptive indicators
Productivity
Impact/prestige
Collaboration
Collaboration maps
Sub-environment indicators

Introduction and general conditions

The bibliometric part of MER14 is an attempt to statistically assess the performance of the individual research specialisations in terms of their publication activities. The present report acts as a complement to the Bibliometric analysis of Mälardalen University carried out by the Centre for Science and Technology Studies at Leiden University (CWTS, 2014). As the reach of metrics based solely on articles in academic journals may be considered too limited for some subject fields, we will compile key figures from the CWTS analysis and present them alongside additional indicators based on other, more inclusive sources of bibliometric data. Thus we hope to provide the reader with complementary (and sometimes competing) perspectives on the research specialisations' publications.

In line with the general outline of MER14, the basis of this analysis is publications authored by researchers belonging to the research specialisations' current roster, published between 2008 and 2013. Publications by former members are thus not included, nor are publications where the contribution of the research specialisations' researchers is solely editorial. Texts not yet published (manuscripts, preprints), oral presentations and posters are also excluded.

The analysis will focus on three dimensions of the research specialisations' publication output: productivity, impact/prestige and collaboration. In trying to capture each of these, we will rely on the following databases:

- DiVA, the Mälardalen University publication repository. Registration in DiVA is mandatory for university employees. For MER14, we've also asked researchers to register publications published while employed elsewhere. DiVA contains mostly scientific material (both refereed and non-refereed), as well as some non-scientific publications (popular science, opinion pieces, etcetera).
- Web of Science (WoS), a collection of databases provided by Thomson Reuters. For this analysis, we've used the five citation indexes [1] containing citation data for both journal articles and conference proceedings.
- Scopus, provided by Elsevier, which contains citation data for serial publications (journals, conference proceedings and book series).
- The CWTS Citation Index (CI), provided by the CWTS at Leiden University. Based on data from the three main parts [2] of WoS, it contains citation data only for the journals covered in WoS, but includes comparison data for different subject fields. All CI-based indicators in this report are taken from CWTS (2014), which also contains an in-depth methodological discussion of how those numbers are arrived at.
- The Norwegian list, which is used nationally for performance-based allocation of research funds in Norway (and locally by some Swedish universities). This is a register of academic journals, series, websites and book publishers, ranked according to perceived prestige, with Level 2 being the most prestigious.

The value of the bibliometric indicators will depend on how well the above data sources reflect the publication traditions and norms of the subject fields in which a research specialisation is active. Because of this, we'll begin by giving a descriptive account of the Educational Science (UV) research specialisation's publications and their coverage in these databases.

Descriptive indicators

As DiVA contains virtually all publications by current Mälardalen University researchers from the chosen period, it's a natural starting point for this analysis. Below we see the total number of publications in DiVA authored by at least one researcher from the UV research specialisation, as well as the distribution over time and the distribution of publication types [3]:

Year    Number of publications
2008    89
2009    106
2010    108
2011    102
2012    88
2013    89
Total   582

Table 1. UV DiVA publications

Figure 1. UV DiVA publication types: Journal article 41.9 %; Conference paper 20.8 %; Book chapter 20.1 %; Report 6.2 %; Other publication types 4.3 %; Book 2.7 %; Article, review 2.2 %; Doctoral thesis (monograph) 1.7 %

Based on the publication information in DiVA, we've searched the other databases utilized in the bibliometric analysis. [4] Because of citation delay, the publication year time span will be shorter for the databases containing citation data. The table below shows the number of the UV research specialisation's publications covered in each of these databases, as well as what percentage of the research specialisation's publications in DiVA that coverage represents (for corresponding publication years).

Database                  Number of publications    Coverage
WoS (08-12)               115                       23.3 %
Scopus (08-12)            111                       22.5 %
CI (08-12)                101                       20.5 %
Norwegian list (08-13)    266                       45.7 %

Table 2. UV publication coverage
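As a cross-check, the coverage percentages in Table 2 can be reproduced from the DiVA counts in Table 1. The sketch below only redoes that arithmetic; the record matching itself (identifying which DiVA publications appear in each database) is not shown, and the variable names are our own.

```python
# Reproduce the coverage percentages in Table 2 from the DiVA counts in Table 1.
diva_per_year = {2008: 89, 2009: 106, 2010: 108, 2011: 102, 2012: 88, 2013: 89}

diva_08_12 = sum(n for year, n in diva_per_year.items() if year <= 2012)  # 493
diva_08_13 = sum(diva_per_year.values())                                  # 582

covered = {
    "WoS (08-12)": (115, diva_08_12),
    "Scopus (08-12)": (111, diva_08_12),
    "CI (08-12)": (101, diva_08_12),
    "Norwegian list (08-13)": (266, diva_08_13),
}

for database, (hits, baseline) in covered.items():
    print(f"{database}: {100 * hits / baseline:.1f} %")  # 23.3, 22.5, 20.5, 45.7
```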

The coverage percentages presented here correspond in principle to the external coverage calculated by CWTS (2014, p. 8), though while they measured only against journal articles and reviews, we use the research specialisation's entire DiVA output as baseline.

It could be argued that not all publications in DiVA are equally important, especially when trying to assess scientific impact or prestige. And of course, none of the databases above is intended to cover everything researchers publish. The Norwegian list, for instance, only contains publishers that conduct formal peer review and that do not function as an outlet primarily for a single institution (Norwegian Association of Higher Education Institutions, n.d.) [5]. However, not all inclusion criteria are entirely quality based. In particular, the focus on articles in English-language journals in some of the databases might be problematic if a research specialisation publishes in a subject field where publications in other mediums and/or languages are considered important. To facilitate the interpretation of the bibliometric indicators, they will all be marked with the name of the database on which they are based (either individually or table-wise). For indicators based on sources other than DiVA, the reader is advised to consult Table 2 to put the results in perspective.

Productivity

The sheer number of publications from a research specialisation will be largely dependent on its size and the chosen time span, so all indicators of publication productivity will be presented divided per researcher and year. We have no records of any hiatuses from research activities that may have occurred during this period, so all researchers are (probably incorrectly) assumed to have been at the disposal of the research specialisation, or their former research groups, for the entire time. However, as the amount of time allocated for research differs depending on staff position, we'll give a separate account of publications authored by senior staff, who generally have more time to engage in research. [6]

Furthermore, co-authoring reduces the amount of time and work each researcher has to invest in a publication. We therefore also look at UV researchers' fractionalized share of publications: if, for example, a researcher from the research specialisation co-authors a publication together with two external authors, the research specialisation is credited with a third of that publication (that is, every author is assumed to have contributed equally to the publication). A short computational sketch of this counting is given at the end of this section.

                                  Number of publications    Publications, fractionalized
Per researcher and year           0.6                       0.4
Per senior researcher and year    1.2                       0.8

Table 3. UV productivity indicators. Based on DiVA publications 08-13

Table 3 shows that during the period 08-13, UV researchers (counting both senior and non-senior ones) authored an average of 0.6 publications per year, and their fractionalized share of publications was 0.4 per researcher and year. Looking only at publications authored by senior researchers, both productivity indicators are twice as high. It should be noted that these figures do not take publication type into consideration. It would be reasonable to expect, for instance, that a research specialisation publishing many books would produce fewer publications per researcher than one that publishes almost exclusively journal articles. However, as this would be difficult to quantify, we have not attempted to adjust for such differences.
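As referenced above, the following is a minimal sketch of the fractionalized counting, under the stated assumption that every author contributed equally. The publication records shown are hypothetical and only illustrate the arithmetic.

```python
# Hypothetical publication records: how many authors belong to the research
# specialisation (UV) and how many authors there are in total.
publications = [
    {"uv_authors": 1, "all_authors": 3},  # 1 UV author + 2 external -> credit 1/3
    {"uv_authors": 2, "all_authors": 2},  # all authors from UV      -> credit 1
    {"uv_authors": 1, "all_authors": 1},  # single UV author         -> credit 1
]

whole_count = sum(1 for p in publications if p["uv_authors"] > 0)
fractional_count = sum(p["uv_authors"] / p["all_authors"] for p in publications)

print(whole_count)                 # 3 publications, counted whole
print(round(fractional_count, 2))  # 2.33 publications, fractionalized
```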

Impact/prestige

When trying to assess the quality of research using bibliometric methods, the focus tends to be on either the impact of publications, measured by the citations they receive, or the perceived prestige of the channels that have accepted the manuscript for publication. In both cases, measures are usually constructed to assess recognition from within the scientific community, and other aspects, such as societal influence, fall outside the scope of analysis.

The most frequently used sources for citation data are Web of Science and Scopus. Here, we will look at the average number of citations (including self-citations) per publication and year in each of these two databases. As citation frequency varies between subject fields, this indicator should not be used for interdisciplinary comparisons, but it could be useful for benchmarking against similarly oriented groups of researchers.

For a more fine-tuned measure of citation impact, we've asked the CWTS in Leiden to calculate the mean field-normalized citation score (MNCS). This is a relative indicator: an MNCS value of 1.2, for example, would mean that, on average, the research specialisation's articles are cited 20 % more frequently than similar articles (taking into account subject field and publication year). The percentage of articles among the 20 % most highly cited (PPtop20%) is also based on comparison with similar articles. Both indicators exclude self-citations (a brief computational sketch of the normalization is given below).

UV impact indicators
Citations/publication & year (WoS, 08-12)       1.0
Citations/publication & year (Scopus, 08-12)    1.0
MNCS (CI, 08-12)                                0.88
PPtop20% (CI, 08-12)                            13.24 %

Table 4. UV impact indicators. CI data from CWTS report (2014)

Table 4 shows that the UV publications available in WoS and in Scopus have received an average of 1 citation per year (from other publications in each of these databases, respectively). We also see that the publications from the UV research specialisation available in the CI database (i.e., articles in WoS journals) have received fewer citations (MNCS < 1) than other articles of the same kind. UV articles also appear among the top 20 % most cited articles less frequently than the 20 % average.

If we turn instead to the prestige of the channels in which the research specialisation publishes, we'll first look at the mean normalized journal score (MNJS). This is calculated by the CWTS using the same principles as the MNCS, but applied to journals instead of individual publications. As journal and article impact do not always correspond, this is more an indication of publication prestige, measuring how influential the journals in which the research specialisation publishes are. An MNJS value of 1.2 would mean the research specialisation publishes in journals that, on average, receive 20 % more citations than other comparable journals.
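As referenced above, the following is a minimal sketch of the field-normalized averaging behind the MNCS (MNJS works analogously at the journal level). It assumes a simple average-of-ratios formulation with hypothetical numbers; the actual CWTS calculation, including how the expected citation rates are derived, is documented in CWTS (2014).

```python
# Hypothetical articles: actual citation counts versus the expected (field- and
# year-specific world average) citation counts for comparable articles.
articles = [
    {"citations": 3, "expected": 2.5},
    {"citations": 0, "expected": 1.8},
    {"citations": 5, "expected": 4.0},
]

# MNCS as an average of per-article normalized scores; 1.0 = field average.
mncs = sum(a["citations"] / a["expected"] for a in articles) / len(articles)
print(round(mncs, 2))  # 0.82 -> cited somewhat less than comparable articles
```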

Another indicator of prestige is the percentage of publications in Norwegian Level 2 channels. These are publishers deemed leading through an academic approval process, set to comprise roughly one-fifth of the publications produced by an academic or research field (Norwegian Association of Higher Education Institutions, n.d., para. 1.5). This means that a percentage higher than twenty would indicate that the research specialisation publishes in prestigious channels more frequently than the average researcher.

UV prestige indicators
MNJS (CI, 08-12)                                0.93
Level 2 publications (Norwegian list, 08-13)    16.2 %

Table 5. UV prestige indicators

Table 5 shows that, on the whole, the UV research specialisation publishes in CI journals (i.e., WoS journals) that have a slightly below-average citation impact (MNJS < 1). When publishing in channels regarded as scholarly by the Norwegian definition, the UV research specialisation is accepted in, or seeks out, the prestigious Level 2 channels less frequently than others in its subject field, that is, less than 20 % of the time.

Collaboration

The extent to which the research specialisation's researchers author publications in collaboration with others can easily be tracked in DiVA, where all authors of a publication are listed. Based on DiVA, we present the average number of authors for the UV research specialisation's publications, as well as the percentage of single-author publications.

UV collaboration indicators, co-authoring
Authors/publication           2.1
Single author publications    43.5 %

Table 6. UV collaboration indicators, co-authoring. Based on DiVA publications 08-13

Table 6 shows that the average UV publication is authored by just over two researchers, and that single-author publications account for 43.5 % of its output. Information on the affiliation of co-authors is, however, largely missing in DiVA. So, to be able to tell how common it is for the UV research specialisation to engage in interorganizational and international collaboration, we have once again turned to CWTS for indicators: No collaboration/single institution means that all of a publication's authors are from a single institution. National collaboration means that a publication is co-authored by researchers from two or more institutions, all within the same country. And International collaboration, subsequently, means that researchers affiliated with institutions from at least two different countries have co-authored a publication. (These categories are mutually exclusive, meaning that a publication authored in both national and international collaboration will be classified as International collaboration.)
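The following is a minimal sketch of the mutually exclusive classification just described, assuming each author has already been resolved to an (institution, country) pair; in the actual CI data, address parsing and deduplication would precede this step.

```python
# Classify a publication into the three mutually exclusive categories above.
# Each author is represented as an (institution, country) tuple (hypothetical data).
def collaboration_type(authors):
    institutions = {institution for institution, _country in authors}
    countries = {country for _institution, country in authors}
    if len(countries) > 1:
        return "International collaboration"  # takes precedence over national
    if len(institutions) > 1:
        return "National collaboration"
    return "No collaboration/single institution"

example = [("Malardalen University", "SE"), ("Stockholm University", "SE")]
print(collaboration_type(example))  # National collaboration
```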

UV collaboration indicators, affiliations
No collaboration/single institution    27.12 %
National collaboration                 31.88 %
International collaboration            41 %

Table 7. UV collaboration indicators, affiliations. Based on CI publications 08-12 (CWTS, 2014)

Table 7 shows that when publishing in CI journals (i.e., WoS journals), 41 % of UV publications are authored in international collaboration, with the rest of the publications fairly equally divided between the National and No collaboration categories.

Collaboration maps

The map shows the names of the organizations that appear in the address fields of the articles in which researchers at UV have collaborated with other authors, as found in WoS 2009-2013. Thus, each address that appears in the map represents an organization with which a UV researcher has either been affiliated or collaborated. The map therefore gives an idea of the various networks in which the researchers in question have historically been involved. The size of the circles depends on the number of times the organization occurs. Circles with the same color belong to the same cluster. The full UV collaboration network contains 74 organizations. In order to make the map more readable, only organizations that occur in more than 2 documents are visible in the map. Of the 74 organizations, 27 meet the threshold. All these organizations are bibliographically connected and appear in the map.

Figure 2. UV collaboration map. Based on WoS, 08-13

The table below shows the number of documents in which the names of the ten most frequently appearing organizations are found.

Affiliation          Number of documents
malardalen univ      77
stockholm univ       31
lund univ            24
uppsala univ         11
jonkoping univ       10
univ s carolina      9
umea univ            8
univ haute alsace    7
univ orebro          5
linkoping univ       4

Table 8. UV publications, most frequently appearing affiliations. Based on WoS 08-13

Sub-environment indicators

The following tables present most of the above indicators at the sub-environment level. We do not have access to CI data for sub-environments, so indicators based on that source are not included. For certain sub-environments, some indicators are based on too few publications for the results to be considered stable, so they might be less meaningful. Indicators based on fewer than 30 publications are shaded grey. The same publication time spans are used here as above: 08-13 for DiVA and the Norwegian list, and 08-12 for WoS and Scopus. Where researchers divide their time between sub-environments, the per-researcher indicators have been adjusted accordingly. Publications from these researchers are counted once for each sub-environment, meaning that the total number of publications for the sub-environments combined may exceed that of the research specialisation.

Sub-environment            DiVA pub   Pub/researcher   Pub, fraction./     Pub/senior          Pub, fraction./senior
                           08-13      & year           researcher & year   researcher & year   researcher & year
Math/Applied Mathematics   142        0.8              0.5                 1.8                 1.0
BUSS                       174        0.8              0.5                 1.8                 1.0
SOLD                       111        0.4              0.4                 0.8                 0.7
MNT                        72         0.4              0.2                 1.0                 0.6
SILU (SISU)                97         0.6              0.5                 1.2                 0.9

Table 9. UV sub-environments productivity indicators. Based on DiVA publications 08-13
BUSS = Children and young people in school and in society
SOLD = Language Studies and Comparative Literature including Subject Didactics
MNT = Mathematics, Science and Engineering Education
SILU (SISU) = Society, interculturalism, leadership and evaluation

Sub-environment            WoS pub   Citations/WoS   Scopus pub   Citations/Scopus   Norwegian list   % Norwegian list,
                           08-12     pub & year      08-12        pub & year         pub 08-13        Level 2
Math/Applied Mathematics   71        1.1             69           0.9                99               17.2 %
BUSS                       23        0.9             19           1.5                71               14.1 %
SOLD                       4         0.5             4            0.9                25               8.0 %
MNT                        12        0.8             14           0.8                36               25.0 %
SILU (SISU)                6         0.4             7            0.4                38               13.2 %

Table 10. UV sub-environments impact/prestige indicators. Grey figures = <30 publications
BUSS = Children and young people in school and in society
SOLD = Language Studies and Comparative Literature including Subject Didactics
MNT = Mathematics, Science and Engineering Education
SILU (SISU) = Society, interculturalism, leadership and evaluation

Sub-environment            Authors/DiVA pub   % Single author DiVA pub
Math/Applied Mathematics   2.3                25.4 %
BUSS                       2.6                24.1 %
SOLD                       1.6                82.9 %
MNT                        2.3                31.9 %
SILU (SISU)                1.7                61.9 %

Table 11. UV sub-environments collaboration indicators.

Notes

[1] The WoS databases used here are: Science Citation Index Expanded, Social Sciences Citation Index, Arts & Humanities Citation Index, Conference Proceedings Citation Index - Science, and Conference Proceedings Citation Index - Social Science & Humanities.

[2] The CI database contains data from Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index.

[3] For a complete list of publication types occurring in the DiVA data, see CWTS (2014, p. 25). Note that the category Other publication types presented here may include, but is not limited to, the DiVA category Other.

[4] The Norwegian list can be searched here: https://dbh.nsd.uib.no/publiseringskanaler/forside?request_locale=en. A list of which sources are included in the different parts of WoS and in Scopus can be found here: http://www.kth.se/kthb/publicering/bibliometri/faq-biblimetrics/faq/which-journals-and-conference-proceedings-are-covered-by-web-of-science-and-scopus-1.378647

[5] The Norwegian model for funding allocation also requires that individual publications present new insight and are presented in a form that allows the research findings to be verified and/or used in new research activity for them to be regarded as academic (Norwegian Association of Higher Education Institutions, n.d., para. 3.2). This categorization is not easily applicable to DiVA data, as MDH publications obviously haven't been registered with the Norwegian definition of scholarliness in mind. For this analysis, only publications registered in DiVA as having Refereed or Other scientific content have been considered for the Norwegian list indicators.

[6] The division between senior and non-senior staff used here may not correspond entirely with which researchers were considered senior in the research specialisations' self-evaluations, as we've based it solely on staff categories. For the bibliometric analysis, the following were categorized as senior: Professors (Professors, promoted senior lecturers); Adjunct professors; Visiting professors; Manager; Senior lecturers (Senior lecturers, promoted lecturers); Adjunct senior lecturers; Researchers; Research engineers; Associate senior lecturers.

References

Center for Science and Technology Studies (2014). Bibliometric analysis of Mälardalen University. Leiden: Leiden University.

Norwegian Association of Higher Education Institutions (n.d.). A Bibliometric Model for Performance-based Budgeting of Research Institutions. Oslo: Norwegian Association of Higher Education Institutions. Retrieved from http://www.uhr.no/documents/rapport_fra_uhr_prosjektet_4_11_engcjs_endelig_versjon_av_hele_oversettelsen.pdf