Visibility and Impact of Research: Bibliometric Services for University Management and Academic Staff


Purdue University, Purdue e-Pubs
Proceedings of the IATUL Conferences, 2016 IATUL Proceedings

Visibility and Impact of Research: Bibliometric Services for University Management and Academic Staff

Caroline Leiss, Technical University of Munich (München, Germany)
Kathleen Gregory, Technical University of Munich

Caroline Leiss and Kathleen Gregory, "Visibility and Impact of Research: Bibliometric Services for University Management and Academic Staff." Proceedings of the IATUL Conferences. Paper 3. http://docs.lib.purdue.edu/iatul/2016/plenary/3

This document has been made available through Purdue e-Pubs, a service of the Purdue University Libraries. Please contact epubs@purdue.edu for additional information.

VISIBILITY AND IMPACT OF RESEARCH: BIBLIOMETRIC SERVICES FOR UNIVERSITY MANAGEMENT AND RESEARCHERS

Dr. Caroline Leiss, Technical University of Munich, University Library, Germany, caroline.leiss@ub.tum.de
Kathleen Gregory, Technical University of Munich, University Library, Germany, kathleen.gregory@ub.tum.de

Abstract

University libraries must adapt to an academic landscape that is increasingly competitive and focused on assessment. As researchers and universities seek new ways to demonstrate their value and differentiate themselves, librarians are carving out new roles in research support and university evaluation. Helping researchers and university leadership to better understand and apply bibliometric data plays an important role in deepening the data, information, and scholarly publishing literacies across the entire institution, as well as in ensuring that bibliometric data are used appropriately in evaluative processes. The University Library at the Technical University of Munich has developed a portfolio of bibliometric services designed to help researchers, university administration, and university leadership understand the meaning, applications, and limitations of bibliometric data as they seek to improve the visibility and impact of their own work and that of the university as a whole. The Library's current service profile includes a comprehensive course, a consultation service for bibliometrics and research impact, and close collaboration with university departments such as the Offices of Evaluation and Faculty Recruitment to integrate bibliometric analyses into personnel and strategic decisions. This paper presents the conception and implementation of the University Library's bibliometric services and serves as an important resource for any library wishing to develop bibliometric or research support services at its institution.

Introduction

The landscape of higher education and academic research is becoming increasingly competitive. Funding opportunities are limited, and competition for grant monies is intense (Van Noorden & Brumfel, 2010). Governments are developing initiatives to improve national research profiles and to compete in the international research arena (Research Excellence Framework, 2015; Australian Research Council; Deutsche Forschungsgemeinschaft). Academic rankings such as the Times Higher Education World University Rankings are taking on new importance (Bornmann, 2014), and the evaluation of universities, faculties, and individual researchers is becoming routine (Keller, 2015). The increased focus on assessment, coupled with the increasingly competitive research environment, is forcing both institutions and researchers to seek ways to demonstrate the value of their work.

Bibliometric reports and indicators play a critical role in the assessment of university research. National assessments, such as the Research Excellence Framework in the United Kingdom and the Excellence in Research for Australia, incorporate bibliometric data in their evaluations (MacColl, 2010; Gibbs & Sergeant, 2009), and bibliometric analyses are increasingly used in countries throughout the world to inform the allocation of financial resources (Ball & Tunger, 2006). Individual researchers are also often required to use bibliometric indicators such as the h-index, citation counts, and the journal impact factor in grant applications and tenure packages (Bladek, 2014; Hendrix, 2010).

At the same time, a broader definition of research impact is emerging to include contributions that research makes not only to academia, but also to areas such as society, the economy, and the development of public policy (Australian Research Council). However, quantitatively measuring aspects related to the societal impact of research is still a nebulous and difficult endeavor (Bornmann, 2013; Given, Kelly, & Willson, 2015). Further limitations of traditional bibliometric indicators (e.g. the journal impact factor and the h-index) are well documented (Haustein & Larivière, 2015), and there has been a growing movement in recent years to advocate for the correct application and use of bibliometrics in the evaluation of research (Hicks, Wouters, Waltman, Rijcke, & Rafols, 2015; San Francisco Declaration on Research Assessment). Despite their limitations, when used appropriately and in conjunction with qualitative methods such as peer review, bibliometrics can provide additional and valuable information about the impact of research.

In order to better support researchers and institutions, academic libraries are expanding their services and developing dedicated support for bibliometrics. According to Ball and Tunger (2006), libraries provide a natural home for bibliometric services. Although there is a dearth of formal bibliometric training in library and information science education (Zhao, 2011), Gumpenberger contends that academic librarians already possess many of the necessary skills, such as in-depth knowledge of major citation databases and experience in finding and analyzing information objectively (Gumpenberger, Wieland, & Gorraiz, 2012). Librarians' expertise in developing new forms of information literacy instruction buttresses their ability to develop bibliometric education. Scholarly publishing literacy and data literacy are two extensions of information literacy that librarians are currently integrating into their instruction programs (Calzada Prado & Marzal, 2013; Wright, Fosmire, Jeffryes, Stowell Bracke, & Westra, 2012; Zhao, 2014). According to Zhao, scholarly publishing literacy prepares and equips researchers for the current dynamic scholarly publishing environment and includes such skills as understanding the characteristics and rankings of journals within a discipline and using new forms of media to communicate scientific research (Zhao, 2014). While the term data literacy is often used to refer to research data, Calzada Prado and Marzal identify a series of broader competencies that characterize data literacy. This set of core skills includes the ability to understand, locate, interpret, evaluate, and ethically use data (Calzada Prado & Marzal, 2013). The skill sets described in both scholarly publishing literacy and data literacy are necessary when working with bibliometric data. Expanding instruction programs to include bibliometric literacy is a logical next step and can also help to deepen these other forms of literacy.

Capitalizing on librarians' existing skills, the University Library at the Technical University of Munich has developed a portfolio of bibliometric services.
These services are designed to help researchers, university administration, and university leadership appropriately use bibliometric data as they seek to demonstrate research impact and improve the visibility of the work of the entire university. This paper discusses the development of these services and addresses the general question of how an academic library can support a large research institution with bibliometric data.

Bibliometrics and Research Support

Providing support for a university's research activities, particularly in a competitive international environment, is a defining characteristic of the modern academic library (Gumpenberger et al., 2012). Bibliometric services are an important cornerstone in supporting researchers and the entire university at nearly every step in the research lifecycle (see Figure 1, created by the authors after the "JISC Research Life Cycle," 2014).

Figure 1: Bibliometric Data in the Research Lifecycle

Citation information and analysis can be used in searching the literature, which is critical to the development of research questions and the generation of new ideas. Bibliometric data can also help to identify leading researchers within a field, potential collaborators, or funding opportunities. Researchers are often required to use bibliometric indicators and citation information in grant and project proposals. Correctly using and interpreting journal-level bibliometric indicators is a critical component of the publication process.

Other factors come into play when considering bibliometric support through the broader lens of research impact (see Figure 2, developed by the authors after the University of New South Wales model of research impact (University of New South Wales, 2016)). As in the research lifecycle, bibliometric information plays a key role in the selection of appropriate publication venues and in developing strategies that can increase the visibility of publications. Creating and maintaining unique author identifiers (such as ORCID and ResearcherID) is critical in ensuring that an author's work is correctly attributed. Correct attribution, in turn, improves the accuracy of calculated bibliometric indicators. Metrics, both traditional and alternative, can be used to measure impact and the attention that an article or author receives. Assisting with understanding how metrics should and should not be used to demonstrate the value of scientific work is another key service that libraries can provide.
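Because correct attribution underpins the accuracy of every downstream indicator, retrieving the list of works attached to an author identifier is a recurring, automatable task in this kind of service work. The following minimal sketch is illustrative only and is not part of the TUM services described in this paper; it assumes the public, read-only ORCID API v3.0 works endpoint (pub.orcid.org) and its JSON layout, which should be verified against current ORCID documentation.

# Illustrative sketch (not from the paper): list the work titles attached to a
# public ORCID record. Assumes the ORCID Public API v3.0 endpoint and response
# layout; verify against current ORCID documentation before relying on it.
import json
import urllib.request

def fetch_orcid_work_titles(orcid_id):
    """Return the titles of works on a public ORCID record."""
    url = f"https://pub.orcid.org/v3.0/{orcid_id}/works"
    request = urllib.request.Request(url, headers={"Accept": "application/json"})
    with urllib.request.urlopen(request) as response:
        record = json.load(response)
    titles = []
    # Works are grouped by shared external identifiers; each group carries
    # one or more work summaries with a nested title object.
    for group in record.get("group", []):
        for summary in group.get("work-summary", []):
            title = ((summary.get("title") or {}).get("title") or {}).get("value")
            if title:
                titles.append(title)
    return titles

if __name__ == "__main__":
    # 0000-0002-1825-0097 is ORCID's published example record (Josiah Carberry).
    print(fetch_orcid_work_titles("0000-0002-1825-0097"))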

Figure 2: Research Impact Model

A large degree of overlap exists between bibliometric services designed to demonstrate impact and other research support services in areas such as scholarly communication and research data management. Developing publication strategies, for example, can lead to discussions of the benefits of open access publication or the proliferation of predatory publishers. Likewise, as research data are increasingly shared and published, discussions have emerged about best practices in data citation (Altman & Crosas, 2013). Studies have also suggested that open access publications and those that include research data are cited more frequently, demonstrating another link between these different research support services (Norris, Oppenheim, & Rowland, 2008; Piwowar, Day, Fridsma, & Ioannidis, 2007).

Bibliometric Services at Libraries

Libraries throughout the world have begun to develop bibliometric services (Drummond & Wartho, 2009). In a survey of 140 libraries in Australia, New Zealand, Ireland, and the United Kingdom, Corrall, Kennan, and Afzal found that the majority of responding libraries offered some type of bibliometric service. Bibliometric training was the most common service offered, followed by the provision of citation reports and calculations of research impact (Corrall, Kennan, & Afzal, 2013). While Australian libraries such as the University of New South Wales began by offering very comprehensive citation reports to researchers and faculties (Drummond & Wartho, 2009), the focus of many libraries has now shifted to training scientists to perform their own bibliometric analyses (Keller, 2015).
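Training researchers to perform their own analyses usually begins with the basic indicators themselves. As a purely illustrative aid added here (not taken from any library's training materials), the short sketch below shows how an h-index is derived from a list of citation counts; the counts, and therefore the resulting value, differ between Web of Science, Scopus, and Google Scholar.

# Illustrative sketch: a researcher has index h if h of their papers have been
# cited at least h times each. Citation counts come from a database such as
# Web of Science, Scopus or Google Scholar and differ between sources.
def h_index(citation_counts):
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank  # at least `rank` papers have `rank` or more citations
        else:
            break
    return h

# Example: papers cited 10, 8, 5, 4 and 3 times give an h-index of 4,
# because four papers have at least four citations each.
assert h_index([10, 8, 5, 4, 3]) == 4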

In the United States, libraries have developed comprehensive LibGuides for bibliometrics and measuring research impact. Often these guides are coupled with bibliometrics workshops or consultations that are integrated into existing services (see University of Maryland, 2016; Borchardt, 2016). There are also reports of specific bibliometric projects in the United States, such as services to support the use of metrics in tenure review (Hendrix, 2010) and using bibliometric data to compare the research outputs of academic departments (Bennett, Leonard, & Wrublewski, 2012).

Some academic and research libraries in Germany and Austria have embraced a comprehensive approach to bibliometric services. The library at the Forschungszentrum Jülich offers a variety of fee-based bibliometric analyses, trend reports, and scientific research maps for both internal and external clients (Ball & Tunger, 2006; Forschungszentrum Jülich - Bibliometrie). The University of Vienna has a dedicated bibliometrics department that offers training, consultations, and expert bibliometric analyses, in addition to planning and conducting outreach and education events such as the European Summer School for Scientometrics (Gumpenberger, Wieland, & Gorraiz, 2012).

Collaboration is a key element in library bibliometric services. Libraries from three Irish universities collaborated to create the open access resource Measuring your Research Impact (MyRI) to support bibliometric training and outreach efforts (Pan & Breen, 2011). Corrall, Kennan, and Afzal found that Irish libraries offered more bibliometric support than academic libraries in the UK, Australia, and New Zealand, and posit that this is due to the development of MyRI (Corrall, Kennan, & Afzal, 2013). In Germany, a number of institutions and libraries founded a collaborative network for bibliometric research (Kompetenzzentrum Bibliometrie). Members of university administration have been identified as potential users of bibliometric services, and opportunities exist for collaboration with departments such as the Office of Research or the Office of Faculty Affairs and Services (Corrall, Kennan, & Afzal, 2013; Bladek, 2014). These types of collaborations vary by country and may depend on factors such as differences in national higher-education structures or in academic cultures (Corrall, Kennan, & Afzal, 2013).

Bibliometric Services at the TUM Library

The Technical University of Munich and the Excellence Initiative

The Technical University of Munich (TUM) is a large research institution with approximately 40,000 students, 22 percent of whom come from other countries. Ten of the university's 13 academic departments are in the fields of science and technology. In 2015, TUM received the second-highest ranking among German universities in the Academic Ranking of World Universities (Shanghai Ranking, 2015), and TUM researchers published more than 6,000 scientific articles and filed 38 patents during 2014 (Technical University of Munich, 2014/2015).

The University Library consists of nine branch libraries at four locations in the greater Munich area and Bavaria. With over 1.5 million electronic and print items, 500,000 loans, five million full-text downloads, and 1.8 million library visitors each year, the University Library is the academic information center of the university.
In addition to maintaining the collection, the Library's 120 staff members work to provide a comprehensive information literacy training program, a media server for electronic material, and open access publication support.

In 2005, the German federal and state governments instated the nationwide Excellence Initiative, whose aim is to make Germany a more attractive research location, making it more internationally competitive and focusing attention on the outstanding achievements of German universities and the German scientific community (Deutsche Forschungsgemeinschaft). The project, initially approved for five years, was extended until 2017, during which time a total of 4.6 billion euros will be spent to develop graduate schools, to establish research clusters of excellence, and to draft institutional strategies for promoting top-level research (Deutsche Forschungsgemeinschaft).

The Excellence Initiative has been a strong impulse for increased competition in German higher education. Efforts to demonstrate research impact, to improve universities' positions in academic rankings, and to attract top scientists are being implemented throughout the country. Eager to prove its reputation and to extend its ability to compete in the international arena, TUM developed a comprehensive series of programs. These programs focus on measurable impact and include the creation of new research fields and an enhanced recruiting and executive search process. As a result of its efforts, the university was awarded University of Excellence status in 2006 and again in 2012 (Technical University of Munich, 2016a).

Need for Bibliometric Services

During the first round of Excellence Initiative funding, various university departments (such as the Office of Controlling, Organization & Planning and the Office of Faculty Recruitment) were beginning to work independently with bibliometric information. At the same time, the University Library began to receive an increased number of requests for bibliometric information from individual researchers and from university administration. At that point, the Library did not offer organized support for these types of questions, and library staff were often unsure how to handle such requests. Although numerous departments were working on similar bibliometric issues, there was no synergy to these efforts.

In 2014 the university management convened a meeting with representatives of the TUM Board of Management, the university administration, and the University Librarian. As a result of this meeting, the Library was assigned the project of identifying units with bibliometric needs, streamlining existing activities, building up expertise, and developing a comprehensive portfolio of bibliometric support services. The Library was granted approval to create a new librarian position that would be primarily dedicated to the project.

Implementation of Services

At present, a team of four librarians provides bibliometric services at TUM. Eighty percent of one librarian's position is dedicated to providing bibliometric services; the other three librarians spend approximately 60% of their combined work hours on the project. The team is situated within the Information Services department of the University Library. The Library's current service portfolio consists of:

- Training
- Consultation service
- Bibliometric analyses
- Collaborative work with university management

Training

Since 2012, a bibliometrics workshop has been part of the Library's teaching portfolio. As librarians developed the workshop, they quickly learned that the topic needed to be presented in the broader context of research visibility, strategic publishing, and research impact. The current iteration of the four-hour course, titled "Visibility and Research Impact: Bibliometrics, Scholarly Communication and Publication Strategies," provides a broad overview of the most common bibliometric indicators and citation databases. A discussion of academic identity management, with an emphasis on unique author identifiers (such as ORCID and ResearcherID), is included in every session. Based on the interests of the participants and time constraints, three of the following topics are also discussed: 1) academic networking; 2) current awareness strategies; 3) selecting a journal for publication; 4) altmetrics; or 5) using search engine optimization techniques to increase the visibility of research.

The number of sessions and participants per year has increased steadily since the course was first offered. In 2015, for example, nearly three times as many participants attended a workshop as in the previous year (Figures 3 and 4; data for 2016 was estimated based on the number of workshops and participants in January-April 2016).

Figure 3: Number of Visibility and Research Impact Workshops per Year (2013-2016)

Figure 4: Number of Workshop Participants per Year (2013-2016)

The course has also been integrated into existing training programs. It is now offered as a workshop option in the TUM Tenure Track Academy, a series of courses that new tenure-track professors take when arriving at the university. It is a required component in the orientation seminars at the TUM Graduate Schools, and it has been presented (with slight modifications) at the request of numerous faculty departments to specific groups of staff and students.

The Library has also broadened the audience for its training opportunities. In December 2015, library staff conducted a bibliometric training for other librarians. Additionally, the Library is planning a multi-day conference, Forum Bibliometrie, hosted by the International Association of University Libraries (IATUL) and the University Library of TUM, for the fall of 2016 for librarians and university staff working with bibliometrics.

Consultation service

A consultation service for bibliometrics and impact was added to the Library's service portfolio in January 2016. Consultations provide members of the TUM community an opportunity to discuss individual questions with trained librarians. Appointments take place in two branch libraries, in an individual's workspace, or via a video-conferencing system. The majority of appointment requests are made using an online form. Requests are then routed to the Library's existing ticketing system used for monitoring and tracking reference questions. Not every question results in a personal consultation, as some questions can easily be answered via email or a short telephone conversation.

The Library began receiving such questions before the formal consultation service had been advertised. Between August 2015 and April 2016, twenty-five questions were received. Forty percent of the questions were submitted by members of the university management and leadership; the remaining sixty percent came from doctoral students and scientists. Many individuals were interested in receiving a broad overview of bibliometrics and research impact. In such cases, individuals were either referred to the standard library course, or a new course specific to a particular discipline was arranged. Other questions were much more specific. Doctoral candidates, for example, arranged appointments to discuss strategies for selecting journals for publication. Scientists submitted questions about disciplinary differences in publication behavior, and university personnel consulted with librarians to develop strategies that could be used to compare the research output of young and interdisciplinary researchers. Interestingly, the Library also received questions from two TUM scientists conducting original research in the field of bibliometrics. Prior to the implementation of the consultation service, librarians did not know that such research was being undertaken at the university.

Bibliometric analyses

The Library also performs basic bibliometric analyses at the request of university leadership. Since implementing the service in fall 2015, many requests have focused on using bibliometric data to compare the research output and performance of a given group of scientists. In order to standardize and streamline the required research, the Library created an evaluation form. Librarians also drafted a disclaimer notice, based on the main points of the Leiden Manifesto and the San Francisco Declaration on Research Assessment, to detail the limitations of using bibliometric data in evaluations. The final packet that the Library produces includes the disclaimer notice, the completed evaluation form, a summary of the search strategies used, and a summary of the data for every scientist (Appendix 1). As a rule, these data are not delivered via email, but rather in person, in order to explain the context and limitations of the research. Other requests from university leadership have involved proofing and assisting with correcting the university affiliations of TUM scientists in citation databases, or analyzing the co-authorship patterns of TUM researchers.

Collaborative Work

One of the first steps in developing the service portfolio was to identify campus units that could use bibliometric information in regular workflows.
The Office of Faculty Recruitment, the Office of Controlling, Organization & Planning, the Office of Research & Innovation, and the Office of Evaluation were identified as possible collaboration partners. The bibliometrics team met with each office to discuss how best to support existing processes.

Librarians and members of the Office of Faculty Recruitment (OFR) identified possible roles for the Library in two projects: the tenure track system and the executive search process. Librarians now offer targeted advising for both tenure-track professors and tenure evaluation committees regarding the use and interpretation of bibliometric data in tenure packages. The OFR advertises this service directly to professors and faculty committees involved in evaluation and recruitment.

Additionally, bibliometric training was integrated into the existing Tenure Track Academy as a result of these conversations.

In the executive search process, the OFR regularly conducts proactive screenings to identify top scientists. Faculty committees also write job descriptions for specific positions and are required to submit a list of top candidates to the OFR with each job description that is drafted. The OFR and the bibliometrics team decided that bibliometric analyses could help both with the screenings and with creating candidate lists. Librarians could also use the evaluation form (see Appendix 1) to provide bibliometric data for candidates invited for an interview. Although these services were ready for implementation at the time of writing, the OFR has yet to advertise them to the faculty.

The Office of Controlling, Organization & Planning (OCP) ensures that TUM is accurately represented in worldwide academic rankings and uses data to depict the state of research and education at the university (Technical University of Munich, 2016b). The bibliometrics team developed a close relationship with the OCP and established regular meetings to discuss shared interests. As a result of these meetings, both parties worked to improve the data basis used in academic rankings. Together, they lobbied the Web of Science to consolidate the 40-plus variations of the university's name present in the database (a brief illustrative sketch of such name consolidation appears at the end of this section). Another problem identified was that a significant percentage of the university's research is not included in the major citation databases. The OCP and the bibliometrics team therefore conceptualized an online university bibliography that would include all of the university's publications. They developed a presentation arguing in favor of creating the bibliography, and the Library investigated options for using the institutional repository as a technical infrastructure. University leadership is expected to approve the project in the near future, at which time further workflows will be developed.

The Office of Research and Innovation (ORI) is responsible for providing assistance with funding and proposal writing. In spring 2016, the ORI was also assigned the task of creating a unified research information system for the entire university. As a result of meetings with the ORI, librarians developed targeted advertising for consultations for researchers using bibliometric data in funding proposals. The development of the research information system will be a multi-year process. Nonetheless, the bibliometrics team and the ORI discussed how the Library could provide publication data to be fed into the system. The team also recommended that the system be built to accommodate the unique author identifier system ORCID, which is currently being implemented at TUM.

The Office of Evaluation (OE) conducts regular faculty evaluations. All faculties at TUM undergo a multistage internal evaluation process, including a self-evaluation and an evaluation by informed international peer reviewers. Among other criteria, the faculty self-evaluation covers the total number of publications, the number of international as well as interdisciplinary collaborations, performance measurements of scientific output, and indicators for national and international benchmarks. Bibliometric data either form the basis of or support all of these evaluations. Discussions to map out possible workflows are planned for spring 2016.
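To illustrate the kind of affiliation clean-up described above, the sketch below maps invented spelling variants of the university's name to a single canonical form. The variant patterns and the canonical string are hypothetical examples added for this purpose; they are not the actual Web of Science entries or the method used by the bibliometrics team.

# Hypothetical sketch: consolidate affiliation name variants to one canonical
# institution name. The variant patterns below are invented for illustration
# and would need careful tuning against real database entries.
import re

CANONICAL_NAME = "Technical University of Munich"
VARIANT_PATTERNS = [
    r"tech\w*\s+univ\w*\s+(of\s+)?munich",      # e.g. "Tech Univ Munich"
    r"technische\s+universit\w+\s+m\w+nchen",   # e.g. "Technische Universitaet Muenchen"
    r"\btum\b",                                 # bare acronym (risky in real data)
]

def normalize_affiliation(raw_affiliation):
    """Map a raw affiliation string to the canonical institution name."""
    text = raw_affiliation.strip().lower()
    for pattern in VARIANT_PATTERNS:
        if re.search(pattern, text):
            return CANONICAL_NAME
    return raw_affiliation.strip()

# Example: three different spellings collapse to one canonical name.
for variant in ["Tech Univ Munich", "Technische Universitaet Muenchen", "TUM, Dept. of Informatics"]:
    print(variant, "->", normalize_affiliation(variant))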
Discussion

The experience of implementing bibliometric services at the University Library provides some valuable insights. From a logistical standpoint, the librarians spent more time than anticipated on the project. Aside from requests for thematic overviews, each question received was unique and needed to be handled with care, as the bibliometric data often had the potential to be used to inform employment or publication decisions. Librarians also needed to dedicate large amounts of time to staying current with developments in the field. The creation of the new librarian position helped to relieve this stress, but as the demand for services grows, staffing could become a problem.

As with any new offering, effective advertising played a crucial role in launching the new service. Targeted emails, information sheets, newsletter submissions, and the development of a bibliometrics webpage helped to spread the word to the community. Directly after beginning the advertising campaign, the Library received more requests for tailored classes and consultations. As time elapsed, however, the number of requests decreased, pointing to a need to market the new service regularly. A bottom-up approach of advertising to doctoral students, combined with a top-down approach of advertising to established faculty, could help to further integrate the service.

While the collaborative efforts of the Library are off to a strong start, there have also been challenges. Arranging meetings with the appropriate contacts often took longer than expected. Working collaboratively to approve workflows and documents also required time and tact. Now that the relationships have been established and the groundwork for integration has been laid, future work should be streamlined. Librarians plan to create a regular bibliometrics roundtable, where all collaboration partners can gather to discuss issues and challenges. This will help to cement relationships, while providing a place for addressing questions and fostering the development of new connections between the various departments.

Obtaining support for the project from university leadership was a prerequisite to success. Once the Library was assigned the task of creating bibliometric services, the path was paved for collaborations with other departments. As a result of the new collaborations with both university leadership and management, awareness of library services has increased, and the Library's position as a leader in research support services has been strengthened.

As thinking about research impact and bibliometrics continues to evolve, so must the Library's services. Librarians will need to consider such topics as how best to support the shift toward documenting the societal impact of research. Developing concise visualizations of bibliometric data could be another area for growth. Such visualizations could be developed for university leadership to communicate institutional research trends, and could have the added potential of further demonstrating the value of the Library's services to decision makers in the current culture of evaluation and assessment.

Appendix 1: Bibliometric Analysis

This report was prepared by the TUM University Library Bibliometrics Team. The data in this report are quantitative and should be used only in conjunction with qualitative, expert opinion.
- The reported values are based on the data included in the indicated databases.
- Comparisons of bibliometric data for researchers working in different disciplines cannot be made.
- Many metrics are influenced by an individual's age. Comparisons without regard to age differences should not be made.
- Journal metrics, such as the impact factor, do not measure the quality of an individual article or person.

See: Hicks, D. et al. (2015): The Leiden Manifesto for research metrics. Nature 520 (7548), 429-431, http://www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351; The San Francisco Declaration on Research Assessment: http://www.ascb.org/dora/

Questions can be addressed to: Kathleen Gregory (kathleen.gregory@ub.tum.de, +49 89 289 28617), Dr. Birgid Schlindwein (birgid.schlindwein@ub.tum.de, +49 8161 714029), or bibliometrie@ub.tum.de

Bibliometric Analysis (fields completed for Candidate 1, Candidate 2, and Candidate 3)

Summary of Findings / Reliability of Data

General Information: Prepared by; Date of data collection; Approximate age of candidate; Gender; Research area, according to personal website; Websites

Author Identifiers: ORCID; ResearcherID; Scopus Author ID; Google Scholar Profile

Publications: Year of first publication; Year(s) with the most publications; Number of publications (on website, in WoS, in Scopus, in Google Scholar, in other databases if necessary)

Article Metrics: Number of citations (WoS, Scopus, Google Scholar)

Personal Metrics: h-indices (WoS, Scopus, Google Scholar)

Journal Metrics: Journals frequently published in and their impact factors, according to WoS (Title (# publications) / IF / Rank of IF within discipline)

Other Signs of Impact: Activity in academic networks (ResearchGate, discipline-specific networks if necessary); Altmetrics; Author order in publications; Articles funded by external agencies according to WoS (DFG projects, projects financed through other agencies); Co-author affiliations (WoS, Scopus)

Search Strategy (per candidate): Web of Science; Scopus; Other databases

Summary (per candidate)
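For readers who wish to automate parts of such a report, a hypothetical sketch of a per-candidate data structure mirroring the form's main fields might look as follows. This is an illustration only, not the actual TUM form or tooling, and no candidate data are included.

# Hypothetical sketch: a per-candidate record mirroring the fields of the
# evaluation form above. Illustrative structure only; not the TUM tooling.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class CandidateReport:
    prepared_by: str
    date_of_data_collection: str
    approximate_age: Optional[int] = None
    research_area: Optional[str] = None
    # Author identifiers
    orcid: Optional[str] = None
    researcher_id: Optional[str] = None
    scopus_author_id: Optional[str] = None
    # Per-database counts, e.g. {"WoS": 0, "Scopus": 0, "Google Scholar": 0}
    publication_counts: dict = field(default_factory=dict)
    citation_counts: dict = field(default_factory=dict)
    h_indices: dict = field(default_factory=dict)
    # Free-text notes on search strategy and overall summary
    search_strategy: str = ""
    summary: str = ""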

References

Altman, M., & Crosas, M. (2013). The Evolution of Data Citation: From Principles to Implementation. IASSIST Quarterly, 62-70.

Australian Research Council. Research Impact Principles and Framework. Retrieved from http://www.arc.gov.au/print/11222

Ball, R., & Tunger, D. (2006). Bibliometric analysis - a new business area for information professionals in libraries? Support for scientific research by perception and trend analysis. Scientometrics, 66(3), 561-577. doi:10.1007/s11192-006-0041-0

Bennett, D., Leonard, M., & Wrublewski, D. (2012). Comparing Engineering Departments Across Institutions: Gathering Publication Impact Data in a Short Timeframe. Issues in Science and Technology Librarianship, (68). doi:10.5062/f42r3pms

Bladek, M. (2014). Bibliometrics Services and the Academic Library: Meeting the Emerging Needs of the Campus Community. College & Undergraduate Libraries, 21(3-4), 330-344. doi:10.1080/10691316.2014.929066

Borchardt, R. (2016). LibGuides: Scholarly Research Impact Metrics. Retrieved from http://subjectguides.library.american.edu/metrics

Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology, 64(2), 217-233. doi:10.1002/asi.22803

Bornmann, L. (2014). On the function of university rankings. Journal of the Association for Information Science and Technology, 65(2), 428-429. doi:10.1002/asi.23019

Calzada Prado, J., & Marzal, M. Á. (2013). Incorporating Data Literacy into Information Literacy Programs: Core Competencies and Contents. Libri, 63(2). doi:10.1515/libri-2013-0010

Corrall, S., Kennan, M. A., & Afzal, W. (2013). Bibliometrics and Research Data Management Services: Emerging Trends in Library Support for Research. Library Trends, 61(3), 636-674. doi:10.1353/lib.2013.0005

Deutsche Forschungsgemeinschaft (DFG). Excellence Initiative. Retrieved from http://www.dfg.de/en/research_funding/programmes/excellence_initiative/index.html

Drummond, R., & Wartho, R. (2009). RIMS: The Research Impact Measurement Service at the University of New South Wales. Australian Academic & Research Libraries, 40(2), 76-87. doi:10.1080/00048623.2009.10721387

Forschungszentrum Jülich - Bibliometrie. Retrieved from http://www.fzjuelich.de/zb/de/leistungen/bibliometrie/bibliometrie_node.html

Gibbs, C., & Sergeant, K. (2009). Opportunity not hard work: Scripted solutions to solving our bibliometric nightmare. Paper presented at Information Online: 14th Annual Exhibition and Conference, Sydney Conference and Exhibition Centre. Retrieved from http://conferences.alia.org.au/online2009/docs/presentationb6

Given, L. M., Kelly, W., & Willson, R. (2015). Bracing for Impact: The Role of Information Science in Supporting Societal Research Impact. ASIST '15: Proceedings of the 78th ASIS&T Annual Meeting: Information Science with Impact: Research in and for the Community. Retrieved from http://onlinelibrary.wiley.com/doi/10.1002/pra2.2015.145052010048/abstract

Gumpenberger, C., Wieland, M., & Gorraiz, J. (2012). Bibliometric practices and activities at the University of Vienna. Library Management, 33(3), 174-183. doi:10.1108/01435121211217199

Haustein, S., & Larivière, V. (2015). The Use of Bibliometrics for Assessing Research: Possibilities, Limitations and Adverse Effects. In I. M. Welpe, J. Wollersheim, S. Ringelhan, & M. Osterloh (Eds.), Incentives and Performance: Governance of Research Organizations (pp. 121-139). Springer Berlin Heidelberg. Retrieved from http://crc.ebsi.umontreal.ca/en/publications/the-useof-bibliometrics-for-assessing-research-possibilities-limitations-and-adverse-effects/

Hendrix, D. (2010). Tenure Metrics: Bibliometric Education and Services for Academic Faculty. Medical Reference Services Quarterly, 29(2), 183-189. doi:10.1080/02763861003723416

Hicks, D., Wouters, P., Waltman, L., Rijcke, S. de, & Rafols, I. (2015). The Leiden Manifesto for research metrics. Nature, (520), 429-431. Retrieved from http://www.nature.com/news/bibliometrics-the-leiden-manifesto-for-research-metrics-1.17351

JISC Research Life Cycle. (2014). Retrieved from http://www.webarchive.org.uk/wayback/archive/20140613220103/http://www.jisc.ac.uk/whatwedo/campaigns/res3/jischelp.aspx

Keller, A. (2015). Research support in Australian university libraries: An outsider view. Australian Academic & Research Libraries, 46(2), 73-85. doi:10.1080/00048623.2015.1009528

Kompetenzzentrum Bibliometrie. Retrieved from http://www.bibliometrie.info/

MacColl, J. (2010). Library Roles in University Research Assessment. LIBER Quarterly, 20(2), 152-168. doi:10.18352/lq.7984

Norris, M., Oppenheim, C., & Rowland, F. (2008). The citation advantage of open-access articles. Journal of the American Society for Information Science and Technology, 59(12), 1963-1972. doi:10.1002/asi.20898

Pan, R., & Breen, E. (2011). MyRI: An open access bibliometrics toolkit collaboration in research skills support. Retrieved from http://de.slideshare.net/infolit_group/pan-breen

Piwowar, H. A., Day, R. S., Fridsma, D. B., & Ioannidis, J. (2007). Sharing Detailed Research Data Is Associated with Increased Citation Rate. PLoS ONE, 2(3), e308. doi:10.1371/journal.pone.0000308

Research Excellence Framework. (2015). Retrieved from http://www.ref.ac.uk/

San Francisco Declaration on Research Assessment: DORA. Retrieved from http://www.ascb.org/dora/

Shanghai Ranking: Academic Ranking of World Universities (ARWU). (2015). Retrieved from http://www.shanghairanking.com/world-university-rankings-2015/germany.html

Technical University of Munich. (2014/2015). Facts & Figures 2014/2015. Retrieved from http://www.tum.de/en/about-tum/our-university/facts-and-figures/

Technical University of Munich. (2016a). Path to success - living excellence. Retrieved from http://www.exzellenz.tum.de/en/homepage/

Technical University of Munich. (2016b). TUM in rankings. Retrieved from http://www.tum.de/en/about-tum/our-university/rankings/

University of New South Wales. (2016). Subject Guides: Research Impact Guide. Retrieved from http://subjectguides.library.unsw.edu.au/researchimpact

University of Maryland. (2016). LibGuides: Bibliometrics and Altmetrics: Measuring the Impact of Knowledge. Retrieved from http://lib.guides.umd.edu/bibliometrics

Van Noorden, R., & Brumfel, G. (2010). Fixing a grant system in crisis. Nature, 464, 474-475. Retrieved from http://www.nature.com/news/2010/240310/full/464474a.html

Wright, S., Fosmire, M., Jeffryes, J., Stowell Bracke, M., & Westra, B. (2012). A Multi-Institutional Project to Develop Discipline-Specific Data Literacy Instruction for Graduate Students. Libraries Faculty and Staff Presentations, Paper 10. Retrieved from http://docs.lib.purdue.edu/lib_fspres/10

Zhao, D. (2011). Bibliometrics and LIS education: How do they fit together? Proceedings of the American Society for Information Science and Technology: ASIST, 48, 1-4.

Zhao, L. (2014). Riding the Wave of Open Access: Providing Library Research Support for Scholarly Publishing Literacy. Australian Academic & Research Libraries, 45(1), 3-18. doi:10.1080/00048623.2014.882873