Complementary bibliometric analysis of the Health and Welfare (HV) research specialisation


April 28th, 2014

Per Nyström, librarian, Mälardalen University Library
per.nystrom@mdh.se, +46 (0)21 101 637

Viktor Öman, librarian, Mälardalen University Library
viktor.oman@mdh.se, +46 (0)16 153 727

Contents

Introduction and general conditions
Descriptive indicators
Productivity
Impact/prestige
Collaboration
Collaboration maps
Sub-environment indicators

Introduction and general conditions

The bibliometric part of MER14 is an attempt to statistically assess the performance of the individual research specialisations in terms of their publication activities. The present report acts as a complement to the Bibliometric analysis of Mälardalen University carried out by the Centre for Science and Technology Studies at Leiden University (CWTS, 2014). As the reach of metrics based solely on articles in academic journals may be considered too limited for some subject fields, we will compile key figures from the CWTS analysis and present them alongside additional indicators based on other, more inclusive sources of bibliometric data. We thus hope to provide the reader with complementary (and sometimes competing) perspectives on the research specialisations' publications.

In line with the general outlay of MER14, the basis of this analysis is publications authored by researchers belonging to the research specialisations' current roster, published between 2008 and 2013. Publications by former members are thus not included, nor are publications where the contribution of the research specialisations' researchers is solely editorial. Texts not yet published (manuscripts, preprints), oral presentations and posters are also excluded.

The analysis will focus on three dimensions of the research specialisations' publication output: productivity, impact/prestige and collaboration. In trying to capture each of these, we will rely on the following databases:

DiVA, the Mälardalen University publication repository. Registration in DiVA is mandatory for university employees. For MER14, we've also asked researchers to register publications published while employed elsewhere. DiVA contains mostly scientific material (both refereed and not refereed), as well as some non-scientific publications (popular science, opinion pieces, etcetera).

Web of Science (WoS), a collection of databases provided by Thomson Reuters. For this analysis, we've used the five citation indexes [1] containing citation data for both journal articles and conference proceedings.

Scopus, provided by Elsevier, contains citation data for serial publications (journals, conference proceedings and book series).

The CWTS Citation Index (CI), provided by the CWTS at Leiden University. Based on data from the three main parts [2] of WoS, it contains citation data only for the journals covered in WoS, but includes comparison data for different subject fields. All CI-based indicators in this report are taken from CWTS (2014), which also contains an in-depth methodological discussion of how those numbers are arrived at.

The Norwegian list, which is used nationally for performance-based allocation of research funds in Norway (and locally by some Swedish universities). This is a register of academic journals, series, websites and book publishers, ranked according to perceived prestige, with Level 2 being the most prestigious.

The value of the bibliometric indicators will depend on how well the above data sources reflect the publication traditions and norms of the subject fields in which a research specialisation is active. Because of this, we'll begin by giving a descriptive account of the Health and Welfare (HV) research specialisation's publications and their coverage in these databases.

Descriptive indicators

As DiVA contains virtually all publications by current Mälardalen University researchers from the chosen period, it's a natural starting point for this analysis. Below we see the total number of publications in DiVA authored by at least one researcher from the HV research specialisation, as well as the distribution over time and the distribution of publication types [3]:

Year    Number of publications
2008     92
2009    125
2010    149
2011    120
2012    140
2013    149
Total   775

Table 1. HV DiVA publications

[Figure 1. HV DiVA publication types: Journal article 58.7%; Conference paper 17.2%; Book chapter 13.0%; Report 5.4%; Other publication types 3.5%; Book 2.2%]

Based on the publication information in DiVA, we've searched the other databases utilized in the bibliometric analysis [4]. Because of citation delay, the publication year time span will be shorter for the databases containing citation data. The table below shows the number of HV research specialisation publications covered in each of these databases, as well as the percentage of the research specialisation's publications in DiVA that this coverage represents (for corresponding publication years).

Database                 Number of publications   Coverage
WoS (08-12)                      231              36.9%
Scopus (08-12)                   234              37.4%
CI (08-12)                       215              34.3%
Norwegian list (08-13)           463              59.7%

Table 2. HV publication coverage
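The coverage percentages in Table 2 are simple ratios against the DiVA baseline for the corresponding publication years. Using the yearly counts from Table 1, a minimal sketch that reproduces them:

```python
# DiVA publication counts per year (Table 1).
diva_per_year = {2008: 92, 2009: 125, 2010: 149, 2011: 120, 2012: 140, 2013: 149}

def coverage(db_count, years):
    """Percentage of DiVA publications (for the given years) found in a database."""
    baseline = sum(diva_per_year[y] for y in years)
    return round(100 * db_count / baseline, 1)

print(coverage(231, range(2008, 2013)))  # WoS 08-12            -> 36.9
print(coverage(234, range(2008, 2013)))  # Scopus 08-12         -> 37.4
print(coverage(215, range(2008, 2013)))  # CI 08-12             -> 34.3
print(coverage(463, range(2008, 2014)))  # Norwegian list 08-13 -> 59.7
```

The computed values match Table 2, confirming that the DiVA totals for 2008-2012 (626 publications) and 2008-2013 (775 publications) are the baselines used.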

The coverage percentages presented here correspond in principle to the external coverage calculated by CWTS (2014, p. 8), though while they measured only against journal articles and reviews, we use the research specialisation's entire DiVA output as baseline. It could be argued that not all publications in DiVA are equally important, especially not when trying to assess scientific impact or prestige. And of course, none of the databases above is intended to cover everything researchers publish. The Norwegian list, for instance, only contains publishers that conduct formal peer review and that do not function primarily as an outlet for a single institution (Norwegian Association of Higher Education Institutions, n.d.) [5]. However, not all inclusion criteria will be entirely quality-based. In particular, the focus on articles in English-language journals in some of the databases might be problematic if a research specialisation publishes in subject fields where publications in other mediums and/or languages are considered important. To facilitate the interpretation of the bibliometric indicators, they will all be marked with the name of the database on which they are based (either individually or table-wise). For indicators based on sources other than DiVA, the reader is advised to consult Table 2 to put the results in perspective.

Productivity

The sheer number of publications from a research specialisation will be largely dependent on its size and the chosen time span, so all indicators of publication productivity will be presented divided per researcher and year. We have no records of any hiatuses from research activities that may have occurred during this period, so all researchers are (probably incorrectly) assumed to have been at the disposal of the research specialisation, or their former research groups, for the entire time. As the amount of time allocated for research differs depending on staff position, we'll also give a separate account of publications authored by senior staff, who generally have more time to engage in research [6]. Furthermore, co-authoring reduces the amount of time and work each researcher has to invest in a publication. We therefore also look at HV researchers' fractionalized share of publications: if, for example, a researcher from the research specialisation co-authors a publication together with two external authors, the research specialisation is credited for a third of that publication (that is, every author is assumed to have contributed equally to the publication).

                                 Number of publications   Publications, fractionalized
Per researcher and year                  0.7                       0.4
Per senior researcher and year           1.5                       0.8

Table 3. HV productivity indicators. Based on DiVA publications 08-13

Table 3 shows that during the period 08-13, HV researchers (counting both senior and non-senior ones) authored an average of 0.7 publications per year, and their fractionalized share of publications was 0.4 per researcher and year. Looking only at publications authored by senior researchers, both these productivity indicators are about twice as high. It should be noted that these figures do not take publication type into consideration. It would be reasonable to expect, for instance, that a research specialisation publishing many books would produce fewer publications per researcher than one that publishes almost exclusively journal articles. However, as this would be difficult to quantify, we have not attempted to adjust for such differences.

Impact/prestige

When trying to assess the quality of research using bibliometric methods, focus tends to be on either the impact of publications, measured by the citations they receive, or the perceived prestige of the channels that have accepted the manuscript for publication. In both cases, measures are usually constructed to assess recognition from within the scientific community, and other aspects, like societal influence, fall outside the scope of analysis. The most frequently used sources for citation data are Web of Science and Scopus. Here, we will look at the average number of citations (including self-citations) per publication and year in each of these two databases. As citation frequency varies between subject fields, this indicator should not be used for interdisciplinary comparisons, but could be useful for benchmarking against similarly oriented groups of researchers. For a more fine-tuned measure of citation impact, we've asked the CWTS in Leiden to calculate the mean field-normalized citation score (MNCS). This is a relative indicator: an MNCS value of 1.2, for example, would mean that on average, the research specialisation's articles are cited 20% more frequently than similar articles (taking into account subject field and publication year). The percentage of articles among the 20% most highly cited (PPtop20%) is also based on comparison of similar articles. Both also exclude self-citations.

HV impact indicators
Citations/publication & year (WoS, 08-12)       1.1
Citations/publication & year (Scopus, 08-12)    1.4
MNCS (CI, 08-12)                                0.8
PPtop20% (CI, 08-12)                            14%

Table 4. HV impact indicators.
CI data from CWTS report (2014).

Table 4 shows that HV publications available in WoS have received an average of 1.1 citations per year from other WoS publications, while for publications covered by Scopus the average is 1.4 citations per year. (Scopus is a larger database than WoS, so it often yields a somewhat higher citation count.) Furthermore, we see that the publications from the HV research specialisation available in the CI database (i.e., articles in WoS journals) have received fewer citations (MNCS < 1) than other articles of the same kind. HV articles also appear among the top 20% most cited articles less frequently than the 20% average.

If we turn instead to the prestige of the channels in which the research specialisation publishes, we'll first look at the mean normalized journal score (MNJS). This is calculated by the CWTS using the same principles as the MNCS, but applied to journals instead of individual publications. As journal and article impact do not always correspond, this is more an indication of publication prestige, measuring how influential the journals in which the research specialisation publishes are. An MNJS value of 1.2 would mean the research specialisation publishes in journals that, on average, receive 20% more citations than other comparable journals.

Another indicator of prestige is the percentage of publications in Norwegian Level 2 channels. These are publishers deemed leading through an academic approval process, set to comprise roughly one-fifth of the publications produced by an academic or research field (Norwegian Association of Higher Education Institutions, n.d., para. 1.5). This means that a percentage higher than twenty would indicate that the research specialisation publishes in prestigious channels more frequently than the average researcher.

HV prestige indicators
MNJS (CI, 08-12)                                 0.97
Level 2 publications (Norwegian list, 08-13)    19.4%

Table 5. HV prestige indicators

Table 5 shows that, on the whole, the HV research specialisation publishes in CI journals (i.e., WoS journals) that have an average citation impact (MNJS close to 1). When publishing in channels regarded as scholarly by the Norwegian definition, the HV research specialisation is accepted in the prestigious Level 2 channels to the same extent as others in their subject field, that is, roughly 20% of the time.

Collaboration

The extent to which the research specialisation's researchers author publications in collaboration with others can easily be tracked in DiVA, where all authors of a publication are listed. Based on DiVA, we present the average number of authors for the HV research specialisation's publications, as well as the percentage of single-author publications.

HV collaboration indicators, co-authoring
Authors/publication            3.1
Single author publications    25.2%

Table 6. HV collaboration indicators, co-authoring. Based on DiVA publications 08-13

Table 6 shows that the average HV publication is authored by just over three researchers, and that single-author publications account for 25.2% of its output. Information on the affiliation of co-authors is, however, largely missing in DiVA.
So to be able to tell how common it is for the HV research specialisation to engage in interorganizational and international collaboration, we have once again turned to CWTS for indicators: No collaboration/single institution means that all of a publication's authors are from a single institution. National collaboration means that a publication is co-authored by researchers from two or more institutions, all within the same country. And International collaboration, subsequently, means that researchers affiliated with institutions from at least two different countries have co-authored a publication. (These categories are mutually exclusive, meaning that a publication authored in both national and international collaboration will be classified as International collaboration.)
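The three mutually exclusive categories can be expressed as a precedence rule over the authors' affiliations. A minimal sketch, using hypothetical (institution, country) pairs:

```python
# Classify a publication into the mutually exclusive collaboration
# categories described above: international collaboration takes
# precedence over national collaboration.
def classify(affiliations):
    institutions = {inst for inst, country in affiliations}
    countries = {country for inst, country in affiliations}
    if len(countries) > 1:
        return "International collaboration"
    if len(institutions) > 1:
        return "National collaboration"
    return "No collaboration/single institution"

# Hypothetical examples:
print(classify([("Malardalen Univ", "SE"), ("Karolinska Inst", "SE")]))  # National collaboration
print(classify([("Malardalen Univ", "SE"), ("Univ Ghent", "BE")]))       # International collaboration
print(classify([("Malardalen Univ", "SE"), ("Malardalen Univ", "SE")]))  # No collaboration/single institution
```

Note how the precedence check on countries implements the rule that a publication with both national and international co-authors counts only as International collaboration.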

HV collaboration indicators, affiliations
No collaboration/single institution    18.14%
National collaboration                 52.56%
International collaboration            29.3%

Table 7. HV collaboration indicators, affiliations. Based on CI publications 08-12 (CWTS, 2014)

Table 7 shows that when publishing in CI journals (i.e., WoS journals), more than half of HV publications are authored in national collaboration, and about 30% of publications are co-authored with researchers from other countries.

Collaboration maps

The map shows the names of organizations that appear in the address fields of the articles in which researchers at HV have collaborated with other authors, as found in WoS 2009-2013. Each address that appears in the map thus represents an organization with which an HV researcher has either been affiliated or collaborated. The map therefore gives an idea of the various networks in which the current HV researchers have historically been involved. The size of a circle depends on the number of times the organization occurs; circles with the same colour belong to the same cluster. The full HV collaboration network contains 251 organizations. In order to make the map more readable, only organizations that occur in more than 3 documents are visible in the map. Of the 251 organizations, 52 meet the threshold. All these organizations are bibliographically connected and appear in the map.

Figure 2. HV collaboration map. Based on WoS, 08-13
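The selection behind Figure 2 amounts to counting, for each organization, the number of documents in which it appears and keeping only those above the threshold (more than 3 documents). A minimal sketch with hypothetical document data:

```python
from collections import Counter

# Count, for each organization, the number of documents whose address
# field mentions it, then keep only organizations above the threshold.
def frequent_orgs(docs, threshold=3):
    counts = Counter()
    for affiliations in docs:
        counts.update(set(affiliations))  # one hit per document, not per author
    return {org: n for org, n in counts.items() if n > threshold}

# Hypothetical data: five documents with their affiliation lists.
docs = [
    ["malardalen univ", "karolinska inst"],
    ["malardalen univ", "karolinska inst", "uppsala univ"],
    ["malardalen univ"],
    ["malardalen univ", "uppsala univ"],
    ["malardalen univ", "karolinska inst"],
]
print(frequent_orgs(docs))  # {'malardalen univ': 5}
```

Converting each document's affiliation list to a set before counting ensures an organization is counted once per document, which matches how Table 8 reports "number of documents".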

The table below shows the number of documents in which the names of the ten most frequently appearing organizations are found.

Affiliation        Number of documents
malardalen univ    177
karolinska inst    116
uppsala univ        85
univ orebro         26
univ gothenburg     15
vaxjo univ          13
jonkoping univ      11
univ ghent          11
univ zaragoza       11
stockholm univ      10

Table 8. HV publications, most frequently appearing affiliations. Based on WoS 08-13

Sub-environment indicators

The following tables present most of the above indicators at the sub-environment level. We do not have access to CI data for sub-environments, so indicators based on that source are not included. For certain sub-environments, some indicators are based on too few publications for the result to be considered stable, so they might be perceived as less meaningful. Indicators based on fewer than 30 publications will be shaded grey. The same publication time spans are used here as above: 08-13 for DiVA and the Norwegian list, and 08-12 for WoS and Scopus. Where researchers divide their time between sub-environments, per-researcher indicators have been adjusted accordingly. Publications from these researchers will be counted once for each sub-environment, meaning that the total number of publications for the sub-environments combined may exceed that of the research specialisation.

Sub-environment          DiVA pub   DiVA pub/         Fraction./        DiVA pub/senior   Fraction./senior
                         08-13      researcher & yr   researcher & yr   researcher & yr   researcher & yr
Caring Sciences          276        0.6               0.3               1.8               1.0
Health Care Education     24        0.5               0.4               1.7               1.3
Medical Science            7        0.1               <0.1              0.2               <0.1
Physiotherapy             65        0.7               0.3               1.5               0.6
Psychology               103        0.9               0.4               1.2               0.5
Public Health Sciences   126        1.3               0.5               2.8               0.9
Social Work              125        1.2               0.7               2.1               1.3
Sociology                 72        0.8               0.6               1.2               0.9

Table 9. HV sub-environments productivity indicators. Grey figures = <30 publications
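The fractionalized figures in Tables 3 and 9 follow the equal-contribution rule described under Productivity: each publication contributes the unit's share of its authors. A minimal sketch with hypothetical publication data:

```python
# Fractionalized publication count: each publication is credited to the
# unit as (unit authors / total authors), assuming equal contributions.
def fractionalized_count(publications):
    return sum(unit_authors / total_authors
               for unit_authors, total_authors in publications)

# Hypothetical example: one publication with 1 unit author out of 3
# (credited 1/3, as in the report's example), and one single-authored
# publication (credited in full).
pubs = [(1, 3), (1, 1)]
print(fractionalized_count(pubs))
```

Dividing this sum by the number of researchers and the number of years yields the "fraction./researcher & year" style of indicator reported above.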

Sub-environment          WoS pub   Citations/WoS   Scopus pub   Citations/Scopus   Norwegian list   % Norwegian list,
                         08-12     pub & yr        08-12        pub & yr           pub 08-13        Level 2
Caring Sciences          104       0.8             110          1.1                193              21.2%
Health Care Education      5       0.3               5          0.5                 14               0%
Medical Science            4       0.4               3          1.1                  6               0%
Physiotherapy             27       1.8              33          2.0                 47              14.9%
Psychology                34       1.1              33          1.7                 67              16.4%
Public Health Sciences    50       1.3              34          2.1                 76              28.9%
Social Work               15       0.8              16          0.8                 46              17.4%
Sociology                  1       0.3               5          0.5                 27               7.4%

Table 10. HV sub-environments impact/prestige indicators. Grey figures = <30 publications

Sub-environment          Authors/DiVA pub   % Single author DiVA pub
Caring Sciences          2.8                25.4%
Health Care Education    1.7                54.2%
Medical Science          4.7                 0%
Physiotherapy            4.3                 9.2%
Psychology               3.4                14.6%
Public Health Sciences   5.0                 6.3%
Social Work              2.4                36.0%
Sociology                1.9                52.8%

Table 11. HV sub-environments collaboration indicators. Grey figures = <30 publications
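The collaboration indicators of Table 11 (and Table 6) reduce to simple aggregates over per-publication author counts. A minimal sketch with hypothetical data:

```python
# Average authors per publication and share of single-author publications,
# in the style of Table 11, from a list of author counts per publication.
def collaboration_indicators(author_counts):
    n = len(author_counts)
    avg_authors = sum(author_counts) / n
    single_share = 100 * sum(1 for a in author_counts if a == 1) / n
    return round(avg_authors, 1), round(single_share, 1)

# Hypothetical data: author counts for eight publications.
print(collaboration_indicators([1, 3, 2, 5, 1, 4, 1, 3]))  # (2.5, 37.5)
```

With fewer than 30 publications, as the report notes for the grey-shaded rows, such aggregates are sensitive to individual publications and should be read with caution.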

Notes

[1] The WoS databases used here are: Science Citation Index Expanded, Social Sciences Citation Index, Arts & Humanities Citation Index, Conference Proceedings Citation Index - Science, and Conference Proceedings Citation Index - Social Science & Humanities.

[2] The CI database contains data from Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index.

[3] For a complete list of publication types occurring in the DiVA data, see CWTS (2014, p. 25). Note that the category Other publication types presented here may include, but is not limited to, the DiVA category Other.

[4] The Norwegian list can be searched here: https://dbh.nsd.uib.no/publiseringskanaler/forside?request_locale=en. A list of which sources are included in the different parts of WoS and in Scopus can be found here: http://www.kth.se/kthb/publicering/bibliometri/faq-biblimetrics/faq/which-journals-and-conference-proceedings-are-covered-by-web-of-science-and-scopus-1.378647

[5] The Norwegian model for funding allocation also requires that individual publications present new insight, and are presented in a form that allows the research findings to be verified and/or used in new research activity, for them to be regarded as academic (Norwegian Association of Higher Education Institutions, n.d., para. 3.2). This categorization is not easily applicable to DiVA data, as MDH publications obviously haven't been registered with the Norwegian definition of scholarliness in mind. For this analysis, only publications registered in DiVA as having Refereed or Other scientific content have been considered for the Norwegian list indicators.

[6] The division between senior and non-senior staff used here may not correspond entirely with which researchers were considered senior in the research specialisations' self-evaluations, as we've based it solely on staff categories. For the bibliometric analysis, the following were categorized as senior: Professors (professors, promoted senior lecturers); Adjunct professors; Visiting professors; Manager; Senior lecturers (senior lecturers, promoted lecturers); Adjunct senior lecturers; Researchers; Research engineers; Associate senior lecturers.

References

Center for Science and Technology Studies (2014). Bibliometric analysis of Mälardalen University. Leiden: Leiden University.

Norwegian Association of Higher Education Institutions (n.d.). A Bibliometric Model for Performance-based Budgeting of Research Institutions. Oslo: Norwegian Association of Higher Education Institutions. Retrieved from http://www.uhr.no/documents/rapport_fra_uhr_prosjektet_4_11_engcjs_endelig_versjon_av_hele_oversettelsen.pdf