Bibliometric report


TUT Research Assessment Exercise 2011
Bibliometric report 2005-2010

Contents

1 Introduction
2 Principles of bibliometric analysis
3 TUT Bibliometric analysis
4 Results of the TUT bibliometric analysis
4.1 Relevance of bibliometric methods
4.2 MNCS by Units of Assessment
4.3 Trend analyses for TUT
4.4 Performance by collaboration type
4.5 Distribution of papers in subject fields with high MNCS
4.6 Impact and robustness
4.7 MNCS in relation to MNJS
5 Summary of the results
Appendixes
Appendix 1. Panels and Units of Assessment for TUT RAE 2011
Appendix 2. Indicators
Appendix 3. Classification of journal categories into 15 disciplines
Appendix 4. WoS subject fields in which TUT researchers have published and number of fractionalized papers (Pf)
References

Abbreviations

A&HCI     Arts & Humanities Citation Index
CWTS      Centre for Science and Technology Studies, Leiden University
INT_COV   Internal coverage, i.e. the average share of references covered by WoS
MCS       Number of citations per publication, excluding self-citations
MNCS      Field-normalized citation impact
MNJS      Field-normalized journal impact
P         Number of publications
Pf        Number of fractionalized publications
NPHCP10   Field-normalized proportion of highly cited publications (top 10%)
PNC       Percentage of uncited publications
PSC       Percentage of self-citations
SCI       Science Citation Index Expanded
SSCI      Social Sciences Citation Index
TCS       Total number of citations, excluding self-citations
VITALITY  Recency of references
WoS       Thomson Reuters Web of Science database

1 Introduction

As part of the Tampere University of Technology (TUT) Research Assessment Exercise, a bibliometric analysis of publications was conducted in 2011. The analysis covers publications from the years 2005-2010 and citations to these publications from the years 2005-2011. It is based on 2,484 papers published by researchers who were employed by Tampere University of Technology on the census date of 31 December 2010.

The study is a quantitative analysis of scientific papers that appeared in international journals and serials indexed in the Thomson Reuters Web of Science (WoS) indices: Science Citation Index Expanded (SCI), Social Sciences Citation Index (SSCI), and Arts & Humanities Citation Index (A&HCI). The papers are compared to the international level. Only papers classified in the database as article, review, or proceedings paper were considered.

All publications produced by TUT researchers in 2005-2010 were extracted from the web-based WoS search interface, i.e. all publications that list a TUT researcher either as the first author or as a co-author were included. Information entered into the databases concerning an author's institutional affiliation is often incomplete and/or inaccurate, and there are publications authored by other scientists with the same name. Not all articles list the institutional addresses of their authors, and the names of many organizations may appear in several variants (CWTS 2007). Therefore, all researchers were asked to verify that their publication lists were correct and complete. Almost all researchers responded to the request.

The lists were delivered to CWTS (Centre for Science and Technology Studies, Leiden University), which carried out the bibliometric analyses and reported the results in September 2011. The analyses were conducted using CWTS standard indicators and methods, described in more detail in sections 3 and 4.
This report presents bibliometric indicators for the whole university, for each faculty (panel), and for each unit of assessment for the period 2005-2010 (see Appendix 1). In addition, more detailed analyses are conducted by combining indicators. These analyses focus mainly on the university level, since not all units have a sufficient number of WoS publications for statistically reliable analyses. Analyses that present unit-level results include only units with more than 40 WoS publications.

2 Principles of bibliometric analysis

Using bibliometric methods to assess research performance rests on the assumption that researchers with important findings actively seek to publish them in international journals, where they are read and cited by other researchers. The scientific impact of an article can thus be taken to be expressed by the number of citations it receives. Bibliometric analysis is a method for assessing the national and international visibility and influence of scientific work. Bibliometric methods are used to analyse the number of scientific articles published by a selected set of authors, the citations to these articles, and the connections between articles, authors, and subjects. (van Raan 2005; Karolinska Institutet Bibliometrics Project Group 2008)

It is important to keep in mind that citations measure impact rather than quality. Although citation impact can be conceived as an aspect of research quality, it does not fully capture the latter concept (Moed 2005). Authors may have different motives for citing a given text; for example, an article may attract a lot of criticism and therefore have a high citation count (negative citing). Another issue is self-citation. In theory it is a way of manipulating citation rates, but the practice of citing oneself is also both common and reasonable. Since scientists tend to build on their own work, a high self-citation count generally indicates a narrow specialty (Garfield 1983). Excluding all self-citations is common practice in bibliometric analyses.

Database

The Thomson Reuters Web of Science does not claim to provide complete coverage of all journals used in scholarly research, but it does claim to include the most important or useful ones (Moed 2005). In total, WoS covers over 12,000 major journals. It is dominated by journals from the US and the EU.
The database uses 255 different subject fields, and each journal is classified as belonging to one or several (at most 7) of these fields. Studies have shown that the database is representative of scientific publishing activities for most major countries and fields, excluding the softer social sciences and humanities (Carpenter & Narin 1981; Moed 2005; ref. Sandström 2009). It should nevertheless be kept in mind that fields which favour publication forums other than journal literature are less well represented; to avoid bias, the analysis is based on field-normalized data. See Table 1 for overall WoS coverage; for more detailed information, see Appendix 3.

Table 1. WoS coverage per discipline (Moed 2010, 126)

Excellent (x > 80%): Molecular biology & biochemistry; Biological sciences related to humans; Chemistry; Clinical medicine; Physics & astronomy
Very good (60% < x < 80%): Applied physics & chemistry; Biological sciences - animals and plants; Psychology & psychiatry; Geosciences; Other social sciences - medicine & health
Good (40% < x < 60%): Mathematics; Economics; Engineering
Poor (x < 40%): Other social sciences; Humanities & arts

Fractionalization

Because many publications are the result of collaborative efforts involving more than one university, different counting methods are applied in bibliometric studies. The most common is whole counting, where every university gets full credit for each paper co-authored with another institution. An alternative is fractionalized counting, where the credit is divided equally between all contributing author addresses. (Piro 2011)

Normalization

Normalization procedures produce results that are relative to the world average, i.e. the average citation rate in the Web of Science. Citation-level normalization (comparison to other publications of equal type) takes into account the type of paper, since different document types tend to have different citation characteristics, the specific years in which the papers were published, and the subject fields into which the papers are classified. For example, the number of citations received in 2005-2010 by an article published by a research unit in 2005 in field X is compared with the average number of citations received during the same period (2005-2010) by all articles published in the same field (X) in the same year (2005).

Generally, a research unit publishes its papers in several WoS subject fields rather than one. When a journal is classified into multiple subfields, citation scores are computed according to the number of field assignments: a paper in a journal classified into N subfields is counted as 1/N paper in each subfield, and so are its citations. (CWTS 2007)
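The fractional counting and normalization rules above can be sketched in a few lines of Python. This is an illustrative sketch, not the CWTS implementation: the paper records and field names are invented, and the world-average baseline is assumed to already be specific to the paper's document type, publication year, and field.

```python
def mncs(papers, world_avg):
    """Field-normalized citation impact (MNCS) with fractional field counting.

    papers:    list of dicts with 'cites' (citation count, self-citations
               already excluded) and 'fields' (WoS subject fields); a paper
               in N fields counts as 1/N paper in each field.
    world_avg: field -> world-average citations for papers of the same
               document type, publication year, and field.
    """
    weighted_sum = 0.0   # sum of fractionalized, normalized citation scores
    weight_total = 0.0   # fractionalized paper count (Pf)
    for paper in papers:
        share = 1.0 / len(paper["fields"])
        for field in paper["fields"]:
            weighted_sum += share * paper["cites"] / world_avg[field]
            weight_total += share
    return weighted_sum / weight_total  # 1.0 = world average

# A paper cited exactly at the world average of both its fields scores 1.0.
print(mncs([{"cites": 8, "fields": ["physics", "optics"]}],
           {"physics": 8.0, "optics": 8.0}))  # 1.0
```

The sketch uses a weighted mean-of-ratios form; CWTS-style indicators also exist in a ratio-of-sums variant, and the choice between them affects results for small units.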

3 TUT Bibliometric analysis

The citation analyses in this report are based on WoS citation data from which self-citations of all authors have been excluded, in accordance with the commonly used international standard: a citation is considered a self-citation, and excluded from the analysis, if the cited and citing papers have at least one author in common. While conducting the analyses for TUT, CWTS (Centre for Science and Technology Studies, Leiden University) used its standard indicators and methods; for the methods, see Appendix 2. The indicators used are as follows:

P: The total number of papers published by the unit during the entire period.

TCS (Total Citation Score): The total number of citations received by P during the entire period, excluding self-citations.

MCS (Mean Citation Score): The average number of citations per publication, excluding self-citations.

PNC (Percentage not cited): The percentage of articles not cited during the period under review, self-citations excluded.

MNCS (Mean Normalized Citation Score): The papers' citation score in comparison to the international level, taking into account the field, the type of paper (e.g. article, review, proceedings paper), and the specific years and fields in which the research unit's papers were published (1 = world average).

MNJS (Mean Normalized Journal Score): The citation score of the journals in which TUT researchers publish, in comparison to the international level in the field (1 = world average).

NPHCP10: The field-normalized proportion of publications belonging to the top 10 percent in terms of citation impact. If the value of NPHCP10 exceeds 1, the unit of assessment has more papers in the top 10 percent than the world average in the same field.

Int_Cov (Internal Coverage): The average share of the references in TUT publications that are covered by WoS. The degree of referring to other WoS-indexed literature indicates the importance of WoS-indexed journal literature in the scientific communication process. This indicator is used for assessing the relevance of bibliometric methods.
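The self-citation rule used here, namely that a citation is excluded if the citing and cited papers share at least one author, is straightforward to express in code. A minimal sketch with invented author names:

```python
def is_self_citation(citing_authors, cited_authors):
    """A citation counts as a self-citation if the citing and cited
    papers have at least one author in common."""
    return not set(citing_authors).isdisjoint(cited_authors)

def tcs(paper_authors, citing_author_lists):
    """Total Citation Score: citations to one paper, self-citations excluded.

    citing_author_lists: one author list per citing paper.
    """
    return sum(1 for citing in citing_author_lists
               if not is_self_citation(citing, paper_authors))

# One citing paper shares an author (excluded), one is independent (counted).
print(tcs(["Virtanen, A", "Smith, J"],
          [["Virtanen, A", "Lee, K"], ["Brown, P"]]))  # 1
```

In practice, matching is done on disambiguated author identities rather than raw name strings, since homonyms and name variants would otherwise distort the count.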

Vitality: The average age of the references per unit of assessment. The indicator is normalized to the world average, which is 0 (for the other indicators it is 1).

PSC: The percentage of self-citations.

Additional analyses [1]:

Collaboration: An entity's (university/panel/unit of assessment) output is divided into two types, national and international collaboration, and the impact (MNCS) of each type is analysed.

Research profile: The analysis provides detailed information on TUT's scientific output across fields, and as such gives an insight into the disciplinary composition of TUT's output as well as the fields in which the university has been influential.

[1] These analyses do not belong to the CWTS standard indicators and are therefore called additional.

4 Results of the TUT bibliometric analysis [2]

This chapter presents the results of TUT's bibliometric analysis. The first three tables present the set of indicators applied at all three levels: university, panel, and unit of assessment. The results are described in more detail in Figures 1-7. The relevance of applying bibliometric methods to TUT's publication performance is presented in Figure 1. In the following figures (2-7), the main bibliometric indicators are used to describe the international influence and visibility of TUT's research. When combining indicators, generally accepted and commonly used combinations have been adopted (see, e.g., van Leeuwen et al. 2003). Figures 3, 4 and 5 are based on university-level data, but Figures 2, 6 and 7 include only those units of assessment whose results are statistically relevant (minimum of 40 WoS publications).

Table 2. TUT results 2005-2010.

Symbol    Indicator                                                        Score 2005-2010
P         Number of publications                                           2,484
TCS       Total number of citations, excluding self-citations              13,063
MCS       Number of citations per publication, excluding self-citations    5.26
PNC       Percentage of uncited publications                               32%
MNCS      Field-normalized citation impact                                 1.10
MNJS      Field-normalized journal impact                                  1.11
NPHCP10   Field-normalized proportion highly cited publications (top 10%)  1.13
INT_COV   Internal coverage, i.e. average share of references in WoS       72%
VITALITY  Recency of references                                            0.08
PSC       Percentage of self-citations                                     28%

Table 3. Results by panels 2005-2010 [3].

Panel       P      TCS     MCS   PNC   MNCS  MNJS  NPHCP10  INT_COV  VITALITY  PSC
Panel 1     256    773     3.02  0.35  1.04  0.93  0.92     0.60     0.09      0.23
Panel 2     21     42      2.00  0.38  1.39  0.89  1.61     0.27     -0.06     0.28
Panel 3     37     103     2.78  0.43  0.87  0.85  0.87     0.36     0.04      0.10
Panel 4     896    3,589   4.01  0.42  0.93  1.02  0.83     0.60     0.08      0.23
Panel 5     1,339  8,706   6.50  0.26  1.20  1.21  1.31     0.81     0.08      0.31
University  2,484  13,063  5.26  0.32  1.10  1.11  1.13     0.72     0.08      0.28

[2] For all tables and figures, data analysis: CWTS, Leiden University; data source: Thomson Scientific/WoS.
[3] Papers published in collaboration between faculties (panels) are counted once for each collaborating faculty. Therefore, when the faculties' publications are added up, the total is higher than the number of TUT publications.

Table 4. Results by Units of Assessment 2005-2010 [4].

Unit of assessment                               P    TCS    MCS   PNC   MNCS  MNJS  NPHCP10  INT_COV  VITALITY  PSC

Panel 1: Faculty of Automation, Mechanical and Materials Engineering
Dept of Automation Science and Engineering       40   77     1.93  0.38  0.88  0.88  0.26     0.41     0.09      0.25
Dept of Intelligent Hydraulics and Automation*   15   22     1.47  0.53  0.93  0.83  0.59     0.32     0.33      0.15
Dept of Materials Science                        161  589    3.66  0.30  1.17  0.94  1.21     0.71     0.08      0.24
Dept of Mechanics and Design*                    26   35     1.35  0.46  0.65  0.98  0.37     0.47     -0.03     0.24
Dept of Production Engineering*                  17   50     2.94  0.47  0.72  0.82  0.73     0.30     0.18      0.07

Panel 2: Faculty of Built Environment
Dept of Civil Engineering*                       21   42     2.00  0.38  1.39  0.89  1.61     0.27     -0.06     0.28
School of Architecture*                          0    0      0.00  0.00  0.00  0.00  0.00     0.00     0.00      0.00

Panel 3: Faculty of Business and Technology Management
Dept of Business Inf. Manag. and Logistics*      13   10     0.77  0.54  0.32  0.69  0.00     0.28     0.07      0.09
Dept of Industrial Management*                   18   85     4.72  0.28  1.44  0.98  1.80     0.46     0.07      0.11
Pori Unit's research in Industrial Management*   6    8      1.33  0.67  0.34  0.82  0.00     0.16     -0.11     0.00

Panel 4: Faculty of Computing and Electrical Engineering
Dept of Electrical Energy Engineering            61   144    2.36  0.46  0.72  1.24  0.78     0.51     0.12      0.35
Dept of Communications Engineering               60   91     1.52  0.48  0.69  0.75  0.44     0.35     0.13      0.24
Dept of Computer Systems                         63   46     0.73  0.62  0.37  0.60  0.05     0.32     0.10      0.34
Dept of Electronics                              178  541    3.04  0.48  1.05  0.85  1.20     0.53     0.06      0.26
Dept of Signal Processing                        470  2,658  5.66  0.33  1.07  1.19  0.88     0.70     0.07      0.21
Dept of Software Systems                         40   22     0.55  0.70  0.29  0.52  0.14     0.22     0.08      0.29
Pori Unit's res. in Electronics and Inf. Tech.   47   131    2.79  0.49  0.90  0.99  1.06     0.60     0.12      0.21

Panel 5: Faculty of Science and Environmental Engineering
Dept of Biomedical Engineering                   207  808    3.90  0.33  0.66  0.83  0.50     0.79     0.05      0.27
Dept of Chemistry and Bioengineering             268  1,792  6.69  0.25  1.12  1.21  1.16     0.82     0.04      0.29
Dept of Energy and Process Engineering           46   119    2.59  0.37  0.80  1.02  0.66     0.59     0.10      0.30
Dept of Mathematics                              94   304    3.23  0.47  0.63  0.99  0.22     0.56     0.00      0.45
Dept of Physics                                  476  4,662  9.79  0.15  1.66  1.37  2.14     0.85     0.10      0.29
Optoelectronics Research Centre                  287  1,098  3.83  0.33  1.04  1.32  0.99     0.87     0.14      0.37

* If a unit has fewer than 40 WoS publications, the results are not statistically reliable.
[4] Papers published in collaboration between units of assessment are counted once for each collaborating unit. Therefore, when the units' publications are added up, the total is higher than the number of TUT publications.

4.1 Relevance of bibliometric methods

When using bibliometric methods, it is important to assess their applicability. The adequacy of the citation indexes across disciplines can be analysed on the basis of the reference behavior of the researchers themselves. The indicator Internal Coverage (Int_Cov) gives the proportion of WoS-indexed references in TUT publications. It is calculated by analysing the reference lists of TUT publications, classifying the references into two categories, WoS and non-WoS, and dividing the number of WoS references by the total number of references. The degree of referring to other WoS-indexed literature indicates the importance of WoS-indexed journal literature in the scientific communication process.

The interpretation of Int_Cov is that the higher the percentage, the more probable it is that a unit's publications are indexed in WoS. According to CWTS, for units with an Int_Cov lower than 40 percent, WoS-based analysis is less relevant. [5] For the analyses to be statistically relevant, the units under scrutiny should also have enough publications in WoS. The limit used by CWTS is a minimum of 50 publications; for these analyses, the limit has been set to a minimum of 40 publications. At TUT, 15 units have more than 40 publications.

Figure 1. Relevance of bibliometric methods (Int_Cov plotted against the number of publications P per unit).

[5] Int_Cov > 80% excellent; 60% < Int_Cov < 80% very good; 40% < Int_Cov < 60% good; Int_Cov < 40% poor.
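The Int_Cov calculation described above reduces to a simple ratio, and the CWTS interpretation bands (excellent/very good/good/poor) to a threshold check. A minimal sketch, with the reference records invented for illustration:

```python
def internal_coverage(references):
    """Int_Cov: share of a publication set's cited references that are
    themselves indexed in WoS.

    references: list of booleans, True if the reference is WoS-indexed.
    """
    if not references:
        return 0.0
    return sum(references) / len(references)

def coverage_class(int_cov):
    """CWTS interpretation bands for Int_Cov."""
    if int_cov > 0.80:
        return "Excellent"
    if int_cov > 0.60:
        return "Very good"
    if int_cov > 0.40:
        return "Good"
    return "Poor"  # WoS-based analysis less relevant

# 72 of 100 references WoS-indexed, matching TUT's university-level value.
refs = [True] * 72 + [False] * 28
print(internal_coverage(refs), coverage_class(internal_coverage(refs)))
# 0.72 Very good
```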

4.2 MNCS by Units of Assessment

MNCS is the field-normalized number of citations per publication (also known as the "crown indicator"). It describes the level of citedness of a unit's publications compared to the world average (= 1). For Figure 2, MNCS has been categorized into three groups:

MNCS < 0.80: low
0.80 <= MNCS <= 1.20: average
MNCS > 1.20: high

TUT units of assessment (with P > 40) have been categorized according to their MNCS.

Figure 2. MNCS by Units of Assessment (n = 15): low 40%, average 53%, high 7%.
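The three-way categorization used for Figure 2 follows directly from the thresholds. In the sketch below, the MNCS values of the 15 units with more than 40 WoS publications are taken from Table 4:

```python
from collections import Counter

def mncs_category(mncs):
    """Classify a unit's MNCS relative to the world average (= 1)."""
    if mncs < 0.80:
        return "low"
    if mncs <= 1.20:
        return "average"
    return "high"

# MNCS values of the 15 units with more than 40 WoS publications (Table 4)
units = [0.88, 1.17, 0.72, 0.69, 0.37, 1.05, 1.07, 0.29,
         0.90, 0.66, 1.12, 0.80, 0.63, 1.66, 1.04]
counts = Counter(mncs_category(u) for u in units)
print(counts)  # 8 average, 6 low, 1 high -> 53%, 40%, 7% of the 15 units
```

Note that the boundary values 0.80 and 1.20 fall into the "average" group, consistent with the inclusive thresholds above.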

4.3 Trend analyses for TUT

The trend analysis conducted for TUT aggregates the publication years into three overlapping four-year periods (2005-2008, 2006-2009, 2007-2010) and measures citedness up to the most recent moment, July 2011. Figure 3 presents the development of the main indicators (MNCS, NPHCP10, and MNJS). MNCS is the field-normalized number of citations per publication (also known as the "crown indicator"); it describes the level of citedness of TUT publications compared to the world average (= 1). NPHCP10 indicates the field-normalized proportion of TUT publications included in the highly cited top 10 percent. MNJS is the field-normalized average journal impact; it describes the level of the journals in which TUT researchers publish their papers compared to the world average. As can be seen, all the indicators exceed the world average, and the trend has been stable over the periods under review.

Figure 3. Main indicators (MNCS, NPHCP10, MNJS) by period (world average = 1).

4.4 Performance by collaboration type

Figure 4 presents the distribution of papers published in national or international collaboration across three categories of MNCS: high (MNCS > 1.20), average (0.80 <= MNCS <= 1.20), and low (MNCS < 0.80). This analysis is based on unit-level data. For each unit, collaboration has been classified as international or national based on the author addresses in WoS. MNCS has been calculated for these two groups for each unit, and the results have been combined in the figure. At the university level, the MNCS of papers published in international collaboration is 1.34, and of those in national collaboration, 0.92.

Figure 4. Performance by collaboration type (distribution of papers over the MNCS categories, international versus national collaboration).

4.5 Distribution of papers in subject fields with high MNCS

The Thomson Reuters WoS databases use 255 different subject fields, and each journal is classified as belonging to one or several (at most 7) of these fields. TUT papers fall into 145 WoS subject fields; see Appendix 4 for further details. Figure 5 presents the distribution of TUT's WoS-indexed articles with a high MNCS (> 1.20) across journal subject fields (as percentages). Only subject fields with 10 or more papers are presented in the figure. In total, 840 of the 2,484 papers belong to the highly cited group. It should be noted that the subject fields are not representative of TUT departments.

Figure 5. Distribution of papers in WoS subject fields with high MNCS 2005-2010 (MNCS > 1.20, P >= 10, n = 840).

4.6 Impact and robustness

In Figure 6, the unit of assessment's value for NPHCP10 is placed on the horizontal axis and its value for MNCS on the vertical axis. Combining the two indicators describes the publication activity of a unit as a whole and gives an idea of how robust the field-normalized indicator MNCS is. A unit with a high MNCS might have a single paper with a very high citation impact, in which case the high impact value rests on that one paper alone. On the other hand, a unit whose MNCS is high and whose NPHCP10 exceeds the international average can be interpreted as having good and robust publication practices.

Figure 6. Impact and robustness (NPHCP10 versus MNCS). Only units of assessment with P > 40 are included.

4.7 MNCS in relation to MNJS

In Figure 7, the unit of assessment's value for MNJS is placed on the horizontal axis and its value for MNCS on the vertical axis. The MNJS indicator is the field-normalized average journal impact and describes the impact of the journals in which TUT researchers publish their papers. In other words, it describes the researchers' level of ambition when choosing the journal in which to publish their results. The four squares of the figure are interpreted as follows:

Square 1: units publish their papers in high-impact journals, and the number of citations they receive exceeds the world average in their field.
Square 2: units publish their papers in high-impact journals, but the number of citations they receive falls below the world average in their field.
Square 3: units publish their papers in low-impact journals, and the number of citations they receive falls below the world average in their field.
Square 4: units publish their papers in low-impact journals, but the number of citations they receive exceeds the world average in their field.

Figure 7. MNCS in relation to MNJS. Only units of assessment with P > 40 are included.
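The square assignment above reduces to comparing both indicators to the world average of 1. A sketch (the handling of values exactly on an axis is a choice; here they are assigned to the lower squares):

```python
def quadrant(mncs, mnjs):
    """Place a unit in the four squares of an MNJS-versus-MNCS plot,
    each indicator compared to the world average of 1."""
    if mnjs > 1 and mncs > 1:
        return 1  # high-impact journals, citations above world average
    if mnjs > 1:
        return 2  # high-impact journals, citations below world average
    if mncs <= 1:
        return 3  # low-impact journals, citations below world average
    return 4      # low-impact journals, citations above world average

# Dept of Physics (Table 4): MNCS 1.66, MNJS 1.37 -> square 1
print(quadrant(1.66, 1.37))  # 1
```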

5 Summary of the results This report describes the international visibility and influence of TUT s scientific work as indicated by the Web of Science. When describing scientific impact, MNCS (also known as the crown indicator ), NPHCP10 and MNJS are commonly used as the main indicators. For TUT, MNCS is 1.10 for the period 2005-2010. This means that TUT publications are cited 10 percent more compared to the world average in their fields. The value for NPHCP10 is 1.13. This means that TUT researchers papers are included in the highly cited top 10 percent of their fields 13 percent more than the world average. The journal normalized indicator, MNJS, is 1.11. MNJS describes researchers ambition concerning the publication forum. Throughout the assessment period, all three indicators have clearly exceeded the world average and have remained stable. TUT publications have a high citation impact in several subject fields. For the entire assessment period, TUT s field normalized indicator MNCS has remained slightly lower (1.10) than the journal normalized indicator MNJS (1.11). This suggests that although TUT researchers publish their papers in journals with a higher than average impact, the number of citations that they receive falls below expectations. One explanation for this could be that the assessment period is somewhat limited and the analysis includes quite recently published papers. In order to determine the extent to which bibliometric methods are relevant in analysing the publishing activity of TUT s units of assessment, two indicators were used: the number of publications (P) and the average amount of references covered by WoS (Int_Cov). In line with these two indicators and their requirements, bibliometric methods were found to be relevant for 15 units of assessment. This means that approximately one third of TUT s units cannot be bibliometrically analysed using WoS as the database. 
There can be a number of reasons for this; for instance, some units publish their results in conference articles or reports, which are not included in WoS. However, the value of Int_Cov for the whole university is 72 percent, which indicates very good WoS coverage.

Bibliometric indicators can be combined to produce more in-depth knowledge of scientific impact. For instance, by combining the two indicators MNCS and NPHCP10, the publication activity of a unit can be described as a whole (Figure 6). As the value of MNCS can be driven by a very small number of highly cited papers, combining the two gives an idea of how robust the field-normalized indicator MNCS is. According to this analysis, six TUT units can be interpreted as having good and robust publication practices.

Another combination of indicators used in this report is MNCS in relation to MNJS (Figure 7). Of the 15 units under review, four publish their papers in high-impact journals and receive more citations than the world average in their field. For units placed in square 4, the impact of their research output is at a high international level, but the publication forums used are not the most ambitious choices.

Appendixes

Appendix 1. Panels and Units of Assessment for TUT RAE 2011

Panel 1: Faculty of Automation, Mechanical and Materials Engineering
Department of Automation Science and Engineering
Department of Intelligent Hydraulics and Automation
Department of Materials Science
Department of Mechanics and Design
Department of Production Engineering

Panel 2: Faculty of Built Environment
Department of Civil Engineering
School of Architecture

Panel 3: Faculty of Business and Technology Management
Department of Business Information Management and Logistics
Department of Industrial Management
Pori Unit's research in Industrial Management*

Panel 4: Faculty of Computing and Electrical Engineering
Department of Electrical Energy Engineering
Department of Communications Engineering
Department of Computer Systems
Department of Electronics
Department of Signal Processing
Department of Software Systems
Pori Unit's research in Electronics and Information Technology*

Panel 5: Faculty of Science and Environmental Engineering
Department of Biomedical Engineering
Department of Chemistry and Bioengineering
Department of Energy and Process Engineering
Department of Mathematics
Department of Physics
Optoelectronics Research Centre

* The Pori Unit is an off-campus unit in the city of Pori. The research conducted at the Pori Unit is closely related to the research activities of two faculties and thus falls under two Panels.

Appendix 2. Indicators

P: Number of publications.

TCS: Total number of citations, excluding self-citations.

MCS: Number of citations per publication, excluding self-citations.

Pnc: Percentage of uncited publications.

MNCS [6]: The publication (paper) citation score in comparison to the international level in the field, to the type of paper (e.g. article, review, proceedings paper), since different document types tend to have different citation characteristics, as well as to the specific years and fields in which the research unit's papers were published (1 = world average).

Example. A unit of assessment has three publications, X, Y and Z.

Citations to publications (c_i):
X: 20 citations (Article, 2006, Physics)
Y: 30 citations (Review, 2005, Biochemistry)
Z: 40 citations (Article, 2005, Chemistry)

World averages (e_i):
Article, 2006, Physics: 20
Review, 2005, Biochemistry: 50
Article, 2005, Chemistry: 20

MNCS = 1/3 (20/20 + 30/50 + 40/20) = 1.2

The value 1.2 of MNCS means that papers published by the unit of assessment receive 20 percent more citations than papers published in the same fields worldwide.

The value of MNCS is categorized as follows:
MNCS < 0.80: low
0.80 ≤ MNCS ≤ 1.20: average
MNCS > 1.20: high

[6] Source of formula for MNCS: Waltman et al. 2011, 39.
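The worked MNCS example above can be checked with a short sketch. The function name and the list-of-pairs data structure are illustrative, not from the report: each paper is reduced to its citation count and the world average for the same document type, year and field.

```python
def mncs(papers):
    """Mean Normalized Citation Score: each paper's citation count is
    divided by the world average for its document type, publication year
    and field, and the ratios are averaged (1.0 = world average)."""
    return sum(c / e for c, e in papers) / len(papers)

# The three papers X, Y, Z from the Appendix 2 example, as
# (citations received, world average for the same type/year/field):
unit = [(20, 20), (30, 50), (40, 20)]
print(round(mncs(unit), 2))  # → 1.2
```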

MNJS: Field-normalized average journal impact. It is based on the number of citations received by all publications in the journals in which TUT publishes, compared to the average number of citations in the field. MNJS is calculated in the same way as MNCS (1 = world average).

NPHCP10: Field-normalized proportion of publications belonging to the top 10 percent in terms of citation impact (1 = world average).

Example. Unit of assessment X has 100 papers in field A. The expected number of papers among the top 10 percent of the most frequently cited papers is 10. If unit X has 20 papers among the top 10 percent of papers in field A, then unit X has twice as many papers among the top 10 percent as expected in field A. This value is normalized to the international level in the field. If the value of NPHCP10 exceeds 1, the unit of assessment has more papers among the top 10 percent than the world average in the same field.

Int_Cov (Internal Coverage): The indicator gives the proportion of references indexed in WoS. In other words, the reference lists of TUT publications are analysed and the references are classified into two categories: WoS and non-WoS references. WoS references are mainly journal articles, whereas non-WoS references can be proceedings papers, reports, books, etc. Int_Cov is calculated by dividing the number of WoS references by the total number of references. The degree of referring to other WoS-indexed literature indicates the importance of WoS-indexed journal literature in the scientific communication process.

Example. Unit of assessment X has a paper in WoS with four references. Three of the references are WoS papers and one is a non-WoS reference. Int_Cov then has a value of 3/4 = 75 percent.

The value of Int_Cov is categorized as follows:
x > 80%: excellent
60% < x ≤ 80%: very good
40% < x ≤ 60%: good
x ≤ 40%: poor

Vitality: The indicator measures the average age of references per unit of assessment.
The indicator is normalized to the world average, which is 0 (for the other indicators it is 1). All scores lie between -1 and 1.

PSC: Percentage of self-citations.
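The Int_Cov and NPHCP10 examples above can be sketched together. This is an illustrative sketch only: the function names are invented, and the inputs mirror the worked examples from Appendix 2.

```python
def int_cov(wos_refs: int, total_refs: int) -> float:
    """Internal coverage: the share of a unit's references that point
    to WoS-indexed literature."""
    return wos_refs / total_refs

def nphcp10(top10_papers: int, total_papers: int) -> float:
    """Field-normalized share of papers in the top 10 percent by
    citation impact. By definition, 10 percent of a unit's papers are
    expected in the top 10 percent, so 1.0 = world average."""
    expected = total_papers / 10  # 10 percent of the unit's papers
    return top10_papers / expected

# Appendix 2 examples: a paper with 3 WoS references out of 4, and a
# unit with 20 top-10% papers out of 100 (10 expected).
print(int_cov(3, 4))     # → 0.75
print(nphcp10(20, 100))  # → 2.0
```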

Appendix 3. Classification of journal categories into 15 disciplines

For each discipline/main field, the share of source items (%) and important journal categories included (non-exhaustive list) are given.

Applied physics & chemistry (10.3): 15 categories, incl. applied physics, materials science, optics, chemical engineering, mechanics, applied chemistry, acoustics, instruments & instrumentation
Biological sciences primarily related to animals and plants (6.6): 16 categories, incl. plant sciences, ecology, zoology, marine & freshwater biology, veterinary sciences, agriculture, food science, biology
Biological sciences primarily related to humans (10.3): 12 more basic-oriented categories primarily related to humans, incl. neurosciences, pharmacology, immunology, endocrinology, microbiology, virology, medicine, research
Chemistry (9.6): General, physical, organic, inorganic & nuclear, analytical and electro-chemistry, polymer science
Clinical medicine (18.7): 34 predominantly clinical categories, incl. oncology, medicine general, surgery, cardiology & cardiovascular system, gastroenterology
Economics (1.4): Economics, management, business
Engineering (7.6): 34 engineering categories, incl. electrical engineering, nuclear science and technology, mechanical engineering, computer science
Geosciences (3.5): 12 categories, incl. environmental sciences, geosciences, meteorology & atmospheric sciences, oceanography, geology, mineralogy
Humanities & arts (4.2): Law, literature, history, art, classics, language and linguistics, philosophy, archeology, poetry, dance, music
Mathematics (3.0): Mathematics, applied mathematics, statistics & probability, miscellaneous mathematics
Molecular biology & biochemistry (7.0): Biochemistry & molecular biology, cell biology, biophysics, biotechnology, developmental biology, biochemical research methods
Other social sciences primarily related to medicine & health (2.3): Public environment and occupational health, nursing, sport science, rehabilitation, substance abuse, family studies, geriatrics, health policy
Other social sciences (3.1): Sociology, education, political sciences, anthropology, geography, international relations
Physics & astronomy (8.2): Atomic, molecular & chemical, condensed matter, nuclear, and mathematical physics, physics of particles and fields, and fluids
Psychology & psychiatry (2.8): All categories related to psychology, psychiatry and behavioural sciences
Other (1.8): Category multidisciplinary

Source: Moed 2010, 189.

Appendix 4. WoS subject fields in which TUT researchers have published and number of fractionalized papers (Pf)

WoS Subject Field: Pf
Acoustics: 5.3
Agricultural Engineering: 2.3
Agriculture, Multidisciplinary: 0.3
Anesthesiology: 3.0
Astronomy & Astrophysics: 28.0
Automation & Control Systems: 19.3
Biochemical Research Methods: 24.1
Biochemistry & Molecular Biology: 29.3
Biology: 18.8
Biophysics: 35.5
Biotechnology & Applied Microbiology: 41.1
Business: 1.7
Cardiac & Cardiovascular Systems: 10.5
Cell & Tissue Engineering: 5.0
Cell Biology: 10.3
Chemistry, Analytical: 9.5
Chemistry, Applied: 6.8
Chemistry, Inorganic & Nuclear: 1.2
Chemistry, Multidisciplinary: 41.3
Chemistry, Organic: 15.7
Chemistry, Physical: 114.2
Clinical Neurology: 6.8
Computer Science, Artificial Intelligence: 58.7
Computer Science, Cybernetics: 1.7
Computer Science, Hardware & Architecture: 10.1
Computer Science, Information Systems: 26.6
Computer Science, Interdisciplinary Applications: 20.7
Computer Science, Software Engineering: 24.4
Computer Science, Theory & Methods: 39.0
Construction & Building Technology: 11.8
Crystallography: 1.3
Dentistry, Oral Surgery & Medicine: 3.0
Developmental Biology: 0.5
Education & Educational Research: 5.0
Education, Scientific Disciplines: 0.5
Electrochemistry: 6.2
Endocrinology & Metabolism: 2.5
Energy & Fuels: 15.2
Engineering, Aerospace: 2.8
Engineering, Biomedical: 49.6
Engineering, Chemical: 19.9
Engineering, Civil: 6.5

Engineering, Electrical & Electronic: 309.0
Engineering, Environmental: 15.2
Engineering, Geological: 1.3
Engineering, Industrial: 8.0
Engineering, Manufacturing: 12.7
Engineering, Mechanical: 19.8
Engineering, Multidisciplinary: 13.6
Environmental Sciences: 39.2
Environmental Studies: 1.2
Ergonomics: 6.8
Food Science & Technology: 1.7
Gastroenterology & Hepatology: 2.0
Genetics & Heredity: 5.3
Geochemistry & Geophysics: 2.8
Geography, Physical: 0.5
Geosciences, Multidisciplinary: 3.5
Geriatrics & Gerontology: 0.5
Health Care Sciences & Services: 9.8
Health Policy & Services: 1.0
Hematology: 3.8
Imaging Science & Photographic Technology: 10.1
Immunology: 3.0
Infectious Diseases: 0.5
Information Science & Library Science: 1.7
Instruments & Instrumentation: 16.8
Management: 8.7
Materials Science, Biomaterials: 28.2
Materials Science, Ceramics: 11.5
Materials Science, Characterization & Testing: 3.5
Materials Science, Coatings & Films: 17.3
Materials Science, Composites: 4.0
Materials Science, Multidisciplinary: 70.5
Materials Science, Paper & Wood: 12.3
Materials Science, Textiles: 5.7
Mathematical & Computational Biology: 20.9
Mathematics: 13.8
Mathematics, Applied: 17.7
Mathematics, Interdisciplinary Applications: 5.6
Mechanics: 15.0
Medical Informatics: 6.0
Medical Laboratory Technology: 0.3
Medicine, General & Internal: 5.5
Medicine, Research & Experimental: 6.1
Metallurgy & Metallurgical Engineering: 16.4
Meteorology & Atmospheric Sciences: 13.3
Microbiology: 12.8
Mineralogy: 0.7
Mining & Mineral Processing: 0.7

Multidisciplinary Sciences: 21.5
Nanoscience & Nanotechnology: 24.5
Neuroimaging: 1.7
Neurosciences: 14.5
Nuclear Science & Technology: 11.5
Nursing: 1.0
Nutrition & Dietetics: 2.0
Obstetrics & Gynecology: 1.5
Oncology: 51.5
Operations Research & Management Science: 11.4
Ophthalmology: 3.0
Optics: 162.8
Orthopedics: 4.0
Otorhinolaryngology: 1.0
Pathology: 8.5
Pediatrics: 0.8
Peripheral Vascular Disease: 1.5
Pharmacology & Pharmacy: 4.7
Physics, Applied: 166.7
Physics, Atomic, Molecular & Chemical: 47.5
Physics, Condensed Matter: 109.7
Physics, Fluids & Plasmas: 14.7
Physics, Mathematical: 26.7
Physics, Multidisciplinary: 85.0
Physics, Nuclear: 0.3
Physics, Particles & Fields: 0.8
Physiology: 4.0
Polymer Science: 40.5
Primary Health Care: 0.7
Psychology, Applied: 2.0
Psychology, Developmental: 1.0
Psychology, Experimental: 0.3
Psychology, Multidisciplinary: 1.0
Public, Environmental & Occupational Health: 7.8
Radiology, Nuclear Medicine & Medical Imaging: 18.7
Rehabilitation: 2.7
Religion: 0.5
Remote Sensing: 1.2
Robotics: 1.7
Social Issues: 0.5
Social Sciences, Interdisciplinary: 0.6
Soil Science: 1.0
Spectroscopy: 5.0
Sport Sciences: 1.3
Statistics & Probability: 4.5
Surgery: 15.8
Telecommunications: 46.9
Thermodynamics: 3.0

Toxicology: 7.1
Transplantation: 0.8
Transportation: 0.3
Transportation Science & Technology: 2.8
Urology & Nephrology: 5.0
Water Resources: 5.5
Zoology: 1.5
Total: 2484

References

CWTS (2007). Scoping study on the use of bibliometric analysis to measure the quality of research in UK higher education institutions. Report to HEFCE by the Leiden group. November 2007. http://www.hefce.ac.uk/pubs/rdreports/2007/rd18_07/rd18_07.pdf

Garfield, E. (1983). Citation indexing: its theory and application in science, technology, and humanities. New York: Wiley.

Karolinska Institutet Bibliometrics Project Group (2008). Bibliometrics: Publication Analysis as a Tool for Science Mapping and Research Assessment. Stockholm, 10 p. http://ki.se/content/1/c6/01/79/31/introduction_to_bibliometrics_v1.3.pdf

van Leeuwen, T.N., Visser, M.S., Moed, H.F., Nederhof, T.J. & van Raan, A.F.J. (2003). The Holy Grail of science policy: Exploring and combining bibliometric tools in search of scientific excellence. Scientometrics, Vol. 57, No. 2, pp. 257-280.

Moed, H.F. (2010). Citation Analysis in Research Evaluation. Dordrecht: Springer.

Piro, F.N. (ed.) (2011). Comparing research at Nordic universities using bibliometric indicators. A publication from the NORIA-net «Bibliometric Indicators for the Nordic Universities». NordForsk Policy Briefs 4 2011. Oslo: NordForsk. http://www.nordforsk.org/files/rapp.bib.2011.pub_21.5.11.

van Raan, A.F.J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, Vol. 62, No. 1, pp. 133-143.

Sandström, U. (2009). Aalto University Bibliometric Report 2003-2007. http://www.aalto.fi/fi/research/rae/results/aalto_university_bibliometric_report_2003-2007.pdf

Waltman, L., van Eck, N.J., van Leeuwen, T.N., Visser, M.S. & van Raan, A.F.J. (2011). Towards a new crown indicator: Some theoretical considerations. Journal of Informetrics, Vol. 5, No. 1, pp. 37-47.