
Do altmetrics work? Twitter and ten other social web services 1

Mike Thelwall 1, Stefanie Haustein 2, Vincent Larivière 3, Cassidy R. Sugimoto 4

1 m.thelwall@wlv.ac.uk, School of Technology, University of Wolverhampton, Wulfruna Street, Wolverhampton WV1 1LY (UK)
2 stefanie.haustein@umontreal.ca, École de bibliothéconomie et des sciences de l'information, Université de Montréal, C.P. 6128, Succ. Centre-Ville, Montréal, QC H3C 3J7 (Canada) and Science-Metrix Inc., 1335A avenue du Mont-Royal E, Montréal, Québec H2J 1Y6 (Canada)
3 vincent.lariviere@umontreal.ca, École de bibliothéconomie et des sciences de l'information, Université de Montréal, C.P. 6128, Succ. Centre-Ville, Montréal, QC H3C 3J7 (Canada) and Observatoire des sciences et des technologies (OST), Centre interuniversitaire de recherche sur la science et la technologie (CIRST), Université du Québec à Montréal, CP 8888, Succ. Centre-Ville, Montréal, QC H3C 3P8 (Canada)
4 sugimoto@indiana.edu, School of Information and Library Science, Indiana University Bloomington, 1320 E. 10th St., Bloomington, IN (USA)

1 Thelwall, M., Haustein, S., Larivière, V. & Sugimoto, C. (in press). Do altmetrics work? Twitter and ten other candidates. PLoS ONE.

Abstract

Altmetric measurements derived from the social web are increasingly advocated and used as early indicators of article impact and usefulness. Nevertheless, there is a lack of systematic scientific evidence that altmetrics are valid proxies of either impact or utility, although a few case studies have reported medium correlations between specific altmetrics and citation rates for individual journals or fields. To fill this gap, this study compares 11 altmetrics with Web of Science citations for 76 to 208,739 PubMed articles with at least one altmetric mention in each case and up to 1,891 journals per metric. It also introduces a simple sign test to overcome biases caused by different citation and usage windows. Statistically significant associations were found between higher metric scores and higher citations for articles with positive altmetric scores in all cases with sufficient evidence (Twitter, Facebook wall posts, research highlights, blogs, mainstream media and forums) except perhaps for Google+ posts. Evidence was insufficient for LinkedIn, Pinterest, question and answer sites, and Reddit, and no conclusions should be drawn about articles with zero altmetric scores or the strength of any correlation between altmetrics and citations. Nevertheless, comparisons between citations and metric values for articles published at different times, even within the same year, can remove or reverse this association, and so publishers and scientometricians should consider the effect of time when using altmetrics to rank articles. Finally, the coverage of all the altmetrics except for Twitter seems to be low, and so it is not clear whether they are prevalent enough to be useful in practice.

Introduction

Although scholars may traditionally have found relevant articles by browsing journals, attending meetings and checking correspondence with peers, in the era of digital sources they may rely upon keyword searches or online browsing instead. Whilst desktop access to many digital libraries and indexes provides potential access to numerous articles, scholars sometimes need strategies to help them identify the most relevant articles from amongst large sets. In response, Google Scholar orders search matches in approximately decreasing order of citations, presumably on the assumption that more highly cited articles are more likely to be important or useful. Digital libraries with citation indexes often offer the same service (e.g., ACM, IEEE). In addition, digital libraries typically offer options to either sort search results by date or to confine the results to a specific year. Presumably, many scholars remain current in their fields and are therefore only interested in recent articles. However, given that citations need time to accrue, they are not the best indicator of important recent work. In response, some publishers have turned to altmetrics [1],[39], which are counts of citations or mentions in specific social web services, because they can appear more rapidly than citations. For example, it would be reasonable to expect a typical article to be most tweeted on its publication day and most blogged within a month of publication. Hence, social media mentions have become a valuable marketing tool for publishers trying to promote current high impact articles, and there are also a number of altmetric tracking websites that offer free and paid services (e.g., altmetric.com, impactstory.org, and sciencecard.org).

The fact that citations take time to accumulate also has an impact on research evaluation, as a wait of a few years after publication is needed before the impact of papers can be measured (more in some disciplines). As a result, many have turned to Journal Impact Factors as a proxy for the potential citation value of articles within journals; however, due to the skewness of citation distributions [2], journal measures should not be used as article-level indicators [3]. Additionally, the relationship between citations and the Impact Factor is weakening [4]. Social media mentions, being available immediately after publication, and even before publication in the case of preprints, offer a more rapid assessment of impact. Lastly, citations only assess the impact of scholarly literature on those who cite; this neglects many other audiences of scholarly literature who may read, but do not cite (see the notion of pure readers [5]-[7]). In particular, the societal impact of research may not be well addressed by citations, and a range of alternative methods have been developed to assess this [40]. Since the social web is widely used outside of science, it may have the potential to inform about societal impact.

The use of altmetrics in information retrieval and research evaluation raises the question: How are altmetric and citation measures related? Do social media mentions predict or correlate with subsequent citation rates for a given article? If a correlation is found, this might suggest that altmetrics and citations measure, at least to a certain extent, the same phenomenon and that altmetrics are merely early indicators of this underlying quality. The absence of such a relationship, however, would suggest that altmetrics measure something different. Given this scenario, the quality that is measured by altmetrics should be examined in order to understand the validity of using such metrics in an evaluative manner or for information retrieval.
This paper contributes to this discussion by comparing eleven different altmetric sources with citation data for 182 to 135,331 (depending on the metric) PubMed documents published in 2010 or later. Specifically, this study seeks to answer the following research question: To what extent do the altmetric indicators associate with citation counts?

Background

Employing non-citation-based metrics in the evaluation of research is not novel. Previous research has looked for correlations between traditional citations and their younger counterparts: online presentations [8], online syllabi [9], Google Scholar citations [10]-[12], Google Book citations [13], and article downloads [14]-[16]. Although webometric and electronic readership studies have tried to reflect scholarly impact in a broader sense, they have often been restricted by scalability of and access to data. As altmetrics focus on social media platforms that often provide free access to usage data through web APIs, data collection is less problematic [17]. Several sources have been proposed as alternatives for measuring the impact of scholarly publications, such as mentions and citations in blogs, Wikipedia, Twitter or Facebook, or reader counts on social reference managers and bookmarking platforms [1], [17]-[20]. Evaluations of these sources have focused on single genres or sources, such as Twitter [21]-[23], blogs [24]-[25], bookmarks [26], and Wikipedia [27]. Some research has focused on a variety of indicators for a single source, such as analyses of PLoS article-level metrics (ALM), which include counts of comments, ratings, social bookmarks and blog citations to articles published in the PLoS journals [3],[28]. Reader counts from social bookmarking services and social reference managers such as Mendeley, CiteULike, BibSonomy and Connotea have also been analyzed [26], [29]-[33].

A few studies have investigated altmetrics and their relationship with traditional citation indicators. Mendeley readership counts were found to correlate moderately with citations for Nature (r=0.56) and Science (r=0.54; [33]), PLoS (r=0.5; [34]), JASIST (r=0.46; [29],[30]), bibliometrics publications (r=0.45; [31]) and more strongly for articles recommended on F1000 (r=0.69; [32]). Tweets of arXiv articles (i.e., preprints of articles in mathematics, physics, astronomy, computer science, quantitative biology, quantitative finance and statistics) associate with early citation counts [23], and tweets of Journal of Medical Internet Research articles within the same year can predict future citation counts [22]. Although these results suggest that there is a positive relationship between tweets and citations, these correlation studies have mainly covered individual elite journals and those that favour internet research. The exception, for arXiv preprints, also covers a somewhat special area of scholarship: articles from quantitative research areas promoted by their authors through self-archiving.

Arguments against the value of altmetrics include the ease with which they can be manipulated and their susceptibility to skew in favour of comical or sexual titles (e.g., in February 2013 the top PLoS article (from PLOS Neglected Tropical Diseases) on altmetric.com was entitled: "An In-Depth Analysis of a Piece of Shit: Distribution of Schistosoma mansoni and Hookworm Eggs in Human Stool"). In order to obtain more robust evidence, larger scale studies are needed. Moreover, the various altmetrics have different characteristics when examined diachronically. Priem, Piwowar, and Hemminger [34] examined the distribution of social media events over time for PLoS articles, noting differences in behaviour. For example, citations, page views and Wikipedia citations tended to increase over time while CiteULike, Mendeley, Delicious bookmarks, and F1000 ratings were relatively unaffected by article ages.
Other metrics contained serious flaws, as changes in service and limitations of data hindered analysis; this highlights concerns over the stability of some of these indicators and their use in longitudinal studies. It seems that altmetrics probably capture a broad, or at least a different, aspect of research visibility and impact in comparison to citation counts. For example, non-publishing, so-called pure readers are estimated to constitute one third of the scientific community [5],[6] and these may tweet or blog articles without ever citing them. Publications also influence the development of new technologies, the daily work of professionals and teaching, and have other societal effects [35],[36], which may also be tweeted about or discussed in the social web. Kurtz and Bollen [37] classify readers of scholarly publications into four groups: researchers, practitioners, undergraduates and the interested public. Whilst all of these might use the social web, the first group is the most likely to publish scholarly papers. Finally, the database used in this article, PubMed, indexes biomedical papers from MEDLINE as well as life science journals and online books. It is owned by the U.S. National Library of Medicine. The MEDLINE journals are selected by a technical advisory committee run by the U.S. National Institutes of Health [42].

Methods

The goal of the research design was to devise a fair test of whether higher altmetric values associate with higher citation counts for articles. Previous altmetric and webometric studies have tended to correlate citations with the web metric on the assumption that, since citation counts are a recognised indicator of academic impact, any other measure that correlates positively with them is also likely to associate with academic impact. Correlation tests are not ideal for altmetrics, however, because many are based upon services with a rapidly increasing uptake. In consequence, newer articles can expect, on average, to receive higher altmetric scores than older articles. Since citations also take time to accrue, the opposite is true for citation counts, and so without adjusting for these differences a correlation test is always biased towards negative correlations. Adjusting citation and usage windows to eliminate these biases, as done with download statistics (e.g., [14],[15],[38]), is difficult because reliable usage data is only available for recent documents, for which the citation window will be too small.

To avoid these issues, a simple sign test was devised. For this test, each article is compared only against the two articles published immediately before and after it (within the data set used and for the same journal). Thus only articles of approximately the same age, which are similarly exposed to the same citation delay and usage uptake biases, are compared to each other. Moreover, any slight advantage or disadvantage of the article published after the one tested should be cancelled out by its averaging with the equivalent advantage or disadvantage of the article published before. The test gives three possible outcomes:

Success: the altmetric score is higher than the average altmetric score of the two adjacent articles and its citation score is higher than the average of the two adjacent articles, OR the altmetric score is lower than the average altmetric score of the two adjacent articles and its citation score is lower than the average of the two adjacent articles.

Failure: the altmetric score is higher than the average altmetric score of the two adjacent articles and its citation score is lower than the average of the two adjacent articles, OR the altmetric score is lower than the average altmetric score of the two adjacent articles and its citation score is higher than the average of the two adjacent articles.

Null: All other cases.
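A minimal sketch of this classification in Python (the function name and demo values are illustrative only; the demo values match the worked example that follows):

```python
def sign_test_outcome(alt_scores, cite_scores):
    """Classify the middle of three chronologically adjacent articles from the
    same journal as a success, failure or null, as defined above.
    alt_scores and cite_scores are (previous, tested, next) triples."""
    alt_prev, alt_mid, alt_next = alt_scores
    cite_prev, cite_mid, cite_next = cite_scores
    alt_avg = (alt_prev + alt_next) / 2     # average altmetric score of the neighbours
    cite_avg = (cite_prev + cite_next) / 2  # average citation count of the neighbours

    if (alt_mid > alt_avg and cite_mid > cite_avg) or (alt_mid < alt_avg and cite_mid < cite_avg):
        return "success"
    if (alt_mid > alt_avg and cite_mid < cite_avg) or (alt_mid < alt_avg and cite_mid > cite_avg):
        return "failure"
    return "null"  # ties on either score, e.g. all three articles uncited

# Worked example from the text: articles A, B, C with 2, 3 and 6 tweets.
print(sign_test_outcome((2, 3, 6), (4, 6, 12)))  # success: 6 < (4+12)/2 = 8
print(sign_test_outcome((2, 3, 6), (1, 2, 1)))   # failure: 2 > (1+1)/2 = 1
print(sign_test_outcome((2, 3, 6), (1, 1, 1)))   # null: 1 equals (1+1)/2 = 1
```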
Note that the null outcome includes cases where all three articles are uncited, which is likely to occur when the articles are relatively new. To illustrate the above, suppose that articles A, B, and C are ranked in publication order and attracted 2, 3, and 6 tweets respectively. Comparing the altmetric score of B (3) with the average of the other two ((2+6)/2=4) results in a prediction that B will have fewer citations than the average of A and C. Hence if A, B, and C get 4, 6, and 12 citations respectively, then this will count as a success (as 6 is less than (4+12)/2=8). If they get 1, 2, and 1 citations, respectively, then this will count as a failure (as 2 is greater than (1+1)/2=1). If A, B and C get 1 citation each then this would count as a null result (as 1 is neither greater than nor less than (1+1)/2=1). Using this scoring, the more strongly an altmetric associates with citations, the higher the ratio of successes to failures should be. Conversely, if an altmetric has no association with citations then the number of successes should not be statistically significantly different from the number of failures.

The altmetric data used originates from altmetric.com. This data was delivered on January 1, 2013 and includes altmetric scores gathered since July. Although the system was undergoing development at the time and there may be periods of lost data, this should not cause false positive results due to the testing method used, as described above. The 11 metrics are the following.

Tweets: Tweets from a licensed Twitter firehose are checked for citations.
FbWalls: A licensed Facebook firehose is used for wall posts to check for citations.
RH: Research highlights are identified from Nature Publishing Group journals.
Blogs: The blog (feed) citations are from a manually-curated list of about 2,200 science blogs, derived from the indexes at Nature.com Blogs, Research Blogging and ScienceSeeker.
Google+: The Google+ Application Programming Interface (API) is used to identify Google+ posts to check for citations.
MSM: The mainstream media citation count is based on a manually curated list of about 60 newspapers and magazines, using links in their science coverage.
Reddits: Reddit.com posts from the Reddit API are checked for citations.
Forums: Two forums are scraped for citations.
Q&A: The Stack Exchange API and scraping of older Q&A sites using the open source version of Stack Exchange's code are used to get online questions and answers to check for citations.
Pinners: Pinterest.com is scraped for citations.
LinkedIn: LinkedIn.com posts from the LinkedIn API are checked for citations.

The altmetric data is not a complete list of all articles with PubMed IDs. Instead it is a list of all articles with a PubMed ID and a non-zero altmetric.com score on at least one of the altmetrics. Citations for these articles, if any, were obtained from WoS by matching the bibliographic characteristics (authors, titles, journals, and pages) of PubMed records with WoS records. First-author self-citations were excluded from the results on the basis that authors would rarely hear about their work from social media. Citation counts with and without these self-citations were strongly correlated (Spearman) for the data, and so this exclusion made little difference to the results. Mentions of articles by their authors in the altmetric data were not removed because this is impractical (e.g., due to Twitter usernames not conforming to guidelines); it seems that no previous study and no altmetric website has attempted to remove self-citations. There were 3,676,242 citations altogether to the articles in the data set, excluding self-citations.
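For concreteness, the per-article records underlying the analysis can be pictured roughly as follows (a minimal sketch only; the field names are hypothetical and do not reflect altmetric.com's or WoS's actual schemas, and a missing metric is treated as unknown rather than zero, as explained below):

```python
from dataclasses import dataclass, field
from typing import Dict

METRICS = ["Tweets", "FbWalls", "RH", "Blogs", "Google+", "MSM",
           "Reddits", "Forums", "Q&A", "Pinners", "LinkedIn"]

@dataclass
class ArticleRecord:
    pmid: int       # PubMed ID, later also used as a chronological proxy
    journal: str    # journal name, used to group comparable articles
    citations: int  # WoS citations, excluding first-author self-citations
    altmetrics: Dict[str, int] = field(default_factory=dict)  # metric name -> count; absent means unknown

# Hypothetical example record: an article mentioned once on Twitter and once in a blog.
example = ArticleRecord(pmid=23000000, journal="PLoS ONE", citations=3,
                        altmetrics={"Tweets": 1, "Blogs": 1})
```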
Although the citation scores for the articles are not reliable due to the short citation windows, this should not cause systematic biases in the results because publication time is taken into account in the method used to compare citations with altmetric scores.

For each journal and each altmetric, a list was created of all articles with a score of at least 1 on the altmetric, discarding articles with a zero score. The reason for the discarding policy was that the data set did not include a complete list of articles in each journal and it was impractical to obtain such a list. Moreover, since the authors did not have control over the data collection process, altmetric data for articles may be missing due to problems in the data collection process (e.g., due to the matching processes used). As a consequence of this, it is not possible to be sure that articles with zero values for an altmetric should not have positive scores (unlike [43], for example). It is more certain that articles with a positive score on an altmetric had their data effectively collected for that altmetric, and so data for articles with non-zero altmetric scores is the most reliable and is the only data used in this article. Since the data collection process varies between altmetrics, it is not possible to assume that a positive score for an article on one altmetric implies that it will also have been effectively monitored for all the other altmetrics. Preliminary testing showed that this was not the case (a preliminary analysis of the data with additional implied zeros for articles with a zero score on one altmetric but a positive score on at least one other altmetric was therefore rejected as unreliable and is not reported here). The discarding policy allowed each list to be complete in the sense of including all articles with an altmetric score of at least 1. The results, therefore, only relate to articles attracting a positive altmetric score.

To obtain the chronological order needed for the sign test, for each journal and altmetric, the document lists were ordered by PubMed ID. Although imperfect, this was the most reliable general source of chronological information available. DOIs sometimes contain chronological information, such as a year, but even when a year is present it can refer to the submission year, acceptance year or publication year. Although the publication year and issue number are included in the bibliographic metadata, they are not detailed enough and in many cases do not reflect the actual date of online availability. In contrast, the PubMed ID is more fine-grained and universal. It seems likely to be reasonably chronologically consistent for each individual journal, if not between journals. As a validity check for this, PubMed IDs were correlated with citation scores, and crosschecking DOI-extracted years with PubMed IDs also confirmed that the use of PubMed IDs to represent time was reasonable. PubMed supplies a range of dates for articles, including Create Date, Date Completed, Date Created, Date Last Revised, Date of Electronic Publication, Date of Publication, and date added to PubMed; of these, the date of electronic publication would also be a logical choice for date ordering.

Conducting the main analysis for journals separately ensures that articles compared with each other are predominantly from the same subject area, except in the case of multidisciplinary journals. For journals with few articles in the data set, any comparisons between altmetrics and citations are unlikely to be statistically significant, but it is still possible to test for statistical significance on the number of journals for which citations for individual articles associate positively with altmetrics more often than negatively. A simple proportion test was used for each altmetric to see whether the proportion of successes was significantly different from the default of 0.5.
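Putting the pieces together, here is a minimal sketch of the per-journal, per-metric procedure (Python with SciPy; it reuses the hypothetical `ArticleRecord` and `sign_test_outcome` sketches above, an exact binomial test stands in for the simple proportion test, and the Bonferroni threshold follows the n=11 correction described below):

```python
from collections import Counter, defaultdict
from scipy.stats import binomtest

def run_sign_tests(articles, metric):
    """Count successes/failures/nulls for one altmetric, comparing each article
    with its two chronological neighbours within the same journal. Articles with
    a zero or unknown score on the metric are discarded, as described above."""
    by_journal = defaultdict(list)
    for a in articles:
        if a.altmetrics.get(metric, 0) >= 1:
            by_journal[a.journal].append(a)

    counts = Counter()
    for journal_articles in by_journal.values():
        journal_articles.sort(key=lambda a: a.pmid)  # PubMed ID as the chronological proxy
        for prev, mid, nxt in zip(journal_articles, journal_articles[1:], journal_articles[2:]):
            outcome = sign_test_outcome(
                (prev.altmetrics[metric], mid.altmetrics[metric], nxt.altmetrics[metric]),
                (prev.citations, mid.citations, nxt.citations))
            counts[outcome] += 1
    return counts

def success_vs_failure(counts, n_metrics=11, alpha=0.05):
    """Two-sided test of whether successes differ from 50% of the non-null outcomes,
    with a Bonferroni correction for testing 11 metrics (0.05/11 ~= 0.0045)."""
    s, f = counts["success"], counts["failure"]
    if s + f == 0:
        return None, False  # no non-null tests, nothing to conclude
    result = binomtest(s, s + f, p=0.5)
    return result.pvalue, result.pvalue < alpha / n_metrics
```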
Null results (i.e., neither success nor failure) were ignored because they do not represent the presence or absence of an association. The proportion of null results is irrelevant because it depends to a great extent on the time since the data was collected. For instance, almost all recent articles would have zero citations recorded and would hence give a null result. The number of null results therefore reveals nothing about the long term underlying relationship between an altmetric and citations. The test can occur only for journals with at least three articles in the data set, and the number of tests is two less than the number of articles in the journal. This accounts for the differences between the number of articles and the number of tests in Table 1. The number of journals differs between Tables 1 and 2 because Table 1 only includes journals with at least one non-null test. A Bonferroni correction for multiple tests was used to hold constant the probability of incorrectly rejecting the null hypothesis. For the p=0.05 level, this reduces the critical p value to 0.05/11 ≈ 0.0045, and for the p=0.01 level, to 0.01/11 ≈ 0.0009.

Results and Discussion

In all cases except Google+ and Reddit, and those for which under 20 articles were available to be tested (Q&A, Pinners, LinkedIn), the success rate of the altmetrics at associating with higher citations significantly exceeded the failure rate at the individual article level (Table 1). The null column of the table includes many cases of new articles with only one altmetric and no citations and is therefore potentially misleading, because the articles may receive citations later and so the altmetric scores for the same articles could then become successes or failures. Overall, there are no cases where the number of successes is lower than the number of failures, and so this suggests that, given sufficient data, all the altmetrics would also show a significantly higher success than failure rate. The case that runs most counter to the hypothesis that altmetrics associate with citations is Google+, which launched on June 28, 2011 and has non-significant results despite a large number of tagged articles. This may be a statistical anomaly, since the proportion of successes is only slightly above 50% even for the metrics with significant results (except for forums).

Table 1. The number of successes and failures for comparisons of citations and metric scores for articles with non-zero metric scores. Articles are only compared against other articles from the same journal.

Metric | Successes | Failures | Z | Null | Total tests | Journals | Articles
Tweets** | (57%) | (43%) | | | | |
FbWalls** | 3229 (58%) | 2383 (42%) | | | | |
RH** | 3852 (56%) | 3046 (44%) | | | | |
Blogs** | 1934 (60%) | 1266 (40%) | | | | |
Google+ | 426 (53%) | 378 (47%) | | | | |
MSM** | 338 (59%) | 232 (41%) | | | | |
Reddits | 103 (56%) | 81 (44%) | | | | |
Forums** | 19 (86%) | 3 (14%) | | | | |
Q&A | 12 (67%) | 6 (33%) | | | | |
Pinners | 4 (80%) | 1 (20%) | | | | |
LinkedIn | 0 (-) | 0 (-) | | | | |

*Ratio significantly different from 0.5 at p=0.05, **Significant at p=0.01; Bonferroni corrected for n=11.

The number of journals for which the success rate of articles exceeds the failure rate (although not necessarily with a significant difference within a journal) is a majority in all cases for which there is sufficient data (Table 2), and the difference is significant for three cases. This result stays the same if the data is restricted to journals with at least 10 tested articles. In summary, there is clear evidence that three altmetrics (tweets, FbWalls, blogs) tend to associate with citations at the level of individual journals.

Although for almost all metrics there are some journals for which the sign test produces more failures than successes, these tend to happen for journals with few articles tested, and hence the majority failure could be a statistical artefact (i.e., due to normal random variations in the data). For instance, the 25 journals with the most tweeted articles all give more successes than failures. For tweets, the journal with the most articles and more failures than successes is the 26th, Proceedings of the Royal Society B (Biological Sciences), with 117 prediction successes and 118 failures. This difference of 1 is easily accounted for by normal random factors in the data. In contrast, the most tweeted journal, Proceedings of the National Academy of Sciences, had 1069 successes and 818 failures (57% and 43%, respectively, of articles that were either successes or failures), a small but significant difference.

Note that the magnitude of the difference between success and failure in Table 2 is not helpful to interpret because it is primarily dependent upon the proportion of journals with few articles represented, for which the chance of success or failure is nearly 50%. Similarly, the magnitudes of the differences between the success and failure rates in both Tables 1 and 2 are not meaningful due to the simple tests used, and the magnitude of the correlation in Table 3 is misleading due to the conflicting (assumed) citation association and negative time association, and so the results do not shed any light on the magnitude of the association between citations and altmetric scores in the cases where an association is proven. The problem of non-significant differences between success rates and failure rates for individual journals could be avoided in Table 2, in theory, by replacing the figures in the second and third columns with the number of journals for which the difference between the number of successes and failures is statistically significant. This is not possible, however, because too few journals have enough articles tested to give a reasonable chance of a statistically significant result. Nevertheless, the results are consistent with, but do not prove, the hypothesis that all the altmetrics tested associate with higher citations.

Although the results are clear for most metrics, they only cover articles with a non-zero altmetric score. It is theoretically possible, but does not seem probable, that the same is not true for all articles. For the omission of articles with zero altmetric scores to bias the results towards sign test failures, articles with zero altmetric scores would need to be more cited than average for articles published at the same time that had a positive altmetric score. This seems unlikely, since the results here show that increased altmetric scores tend to associate with increased citations. Another limitation is that the results are only for PubMed articles, and so it is not clear whether they would also apply outside the biomedical and life sciences. The differing sample sizes for the altmetrics are also important because altmetric-citation associations may well be significant for most of the altmetrics but hidden by insufficient data. Finally, unlike one previous study [22], no predictive power can be claimed from the results. Although it seems likely that most altmetric values precede citations - for example, tweets seem to appear shortly after an article has been posted online [23] - this has not been tested here because the data does not include origin dates for the scores. In other words, we did not directly test whether high altmetric scores today make high citations tomorrow more likely.

Related to the issue of predictive power, it is clear from Table 1 that, other than tweets, the metrics had a high proportion of zero scores. For instance, there were only 20% as many Research Highlights articles as tweeted articles and only 0.04% as many LinkedIn articles as tweeted articles. These figures are only estimates because there may be missing data, and other data collection methods may have been able to identify more matches in all cases (including for Tweets).

Nevertheless, the disparities in the numbers of articles in Table 1 highlight that the coverage of the altmetrics, and particularly those other than Twitter, may be low. Low coverage in combination with statistically significant results for an altmetric suggests that it is not useful for differentiating between average articles but may only be useful for identifying either exceptional articles or a sample of above-average articles.

Table 2. Successes and failures for articles with non-zero metric scores, aggregated by journal, and only including journals for which there is at least one success or failure.

Metric+ | Mostly success | Mostly failure | Equal | Z | Journals
Tweets** | 1097 (58%) | 646 (34%) | (8%) | | 1891
(≥10 articles)** | 1032 (59%) | 586 (33%) | (8%) | | 1757
FbWalls** | 414 (53%) | 282 (36%) | (11%) | | 782
(≥10 articles)** | 308 (55%) | 188 (34%) | (11%) | | 558
RH | 276 (51%) | 221 (41%) | (9%) | |
(≥10 articles) | (51%) | 157 (41%) | (8%) | | 380
Blogs** | 190 (58%) | 104 (32%) | (10%) | | 326
(≥10 articles)** | 129 (57%) | 70 (31%) | (12%) | | 225
Google+ | 61 (50%) | 53 (44%) | (6%) | |
(≥10 articles) | (48%) | 24 (46%) | (6%) | | 52
MSM | 29 (56%) | 17 (33%) | (12%) | |
(≥10 articles) | (52%) | 9 (36%) | (12%) | | 25
Reddits | 22 (51%) | 17 (40%) | (9%) | | 43
(≥10 articles) | 9 (47%) | 7 (37%) | (16%) | | 19
Forums | 5 (83%) | 1 (17%) | (0%) | | 6
(≥10 articles) | 3 (100%) | 0 (0%) | (0%) | | 3
Q&A | 4 (67%) | 1 (17%) | (17%) | | 6
(≥10 articles) | 2 (67%) | 0 (0%) | (33%) | | 3
Pinners | 2 (67%) | 1 (33%) | (0%) | | 3
(≥10 articles) | 0 (-) | 0 (-) | 0 (-) | | 0
LinkedIn | 0 (-) | 0 (-) | - | | 0
(≥10 articles) | 0 (-) | 0 (-) | - | | 0

+ For each metric, the first row is for all journals and the second row (≥10 articles) is for journals with at least 10 articles tested. * Ratio of successes to failures significantly different from 0.5 at p=0.05, ** Significant at p=0.01; both Bonferroni corrected for n=11.

Correlation tests were run on the data to test the importance of time for identifying significant associations between altmetrics and citations. Whilst four of the altmetrics significantly and positively correlate with citations (with a medium correlation effect size for RH, small for blogs, and smaller for MSM and FbWalls [40]), the correlation for Twitter is significant and negative (with a small effect size [40]; Table 3). The reason seems to be that Twitter use is increasing much faster than use of the other services, so that more recent articles are more tweeted but are typically uncited. In other words, this reflects the two biases of correlation coefficients, described above, that are caused by the level of social media uptake on the one hand and by citation delay on the other. To test this, we ran another correlation test for Twitter based on articles from 2010, as dated by their DOI (a very approximate heuristic, since this could be the submission date, the acceptance date, the online first date or the final publication date), finding a small, significantly negative correlation. A partial correlation to remove the influence of time through PubMed IDs (again a heuristic, especially because it is used across multiple journals that may have different PubMed submission strategies) improved this to an almost zero correlation of 0.009, tending to confirm the importance of time. An implication of these results for publishers and digital library users is that time from publication should be considered in addition to altmetric scores when using altmetrics to rank search results.

The negative correlation for Tweets in Table 3 should not be interpreted as evidence that high tweet counts do not associate with high quality articles. On the contrary, the evidence from Tables 1 and 2 is that tweets are useful for indicating more highly cited articles; the negative correlation in Table 3 is due to tweets for uncited articles that, if the trend continues, will tend to become more highly cited over time. Note that these correlations are not reliable because they include articles from multiple journals with different citation rates, with different PubMed submission times and strategies, and associated with fields that presumably have different cultures of Twitter use.

Table 3. Correlations between metric values and citations (excluding self-citations) for all articles with non-zero scores on each altmetric.

Metric | Spearman | Articles (>0) | Metric total
Tweets | ** | 135,331 |
FbWalls | 0.050** | 24,822 | 35,317
RH | 0.373** | 23,980 | 35,365
Blogs | 0.201** | 13,325 | 17,699
Google+ | ** | 3,440 | 5,531
MSM | 0.088** | 2,402 | 3,209
Reddits | 0.062** | 1,516 | 1,766
Forums | 0.033** | |
Q&A | 0.048** | |
Pinners | 0.005** | |
LinkedIn | 0.009** | |

* Significant at p=0.05, ** Significant at p=0.01; both Bonferroni corrected for n=11.

The correlations in Table 3 also confirm that the magnitudes of the significant results in Tables 1 and 2 do not give evidence of the likely size of the underlying correlation between the altmetrics and citations. For example, there are positive associations for Twitter in Tables 1 and 2 and a negative correlation in Table 3. Hence it is not possible to speculate about the degree of accuracy of citation estimates made with altmetrics from the data set used here.

Conclusions

The results provide strong evidence that six of the eleven altmetrics (tweets, Facebook wall posts, research highlights, blog mentions, mainstream media mentions and forum posts) associate with citation counts, at least in the medical and biological sciences and for articles with at least one altmetric mention, but the methods used do not shed light on the magnitude of any correlation between the altmetrics and citations (i.e., the correlation effect size is unknown).

Nevertheless, the coverage of all of the altmetrics, except possibly Twitter, is low (below 20% in all cases and possibly substantially below 20%), and so these altmetrics may only be useful to identify the occasional exceptional or above-average article rather than as universal sources of evidence. The evidence also suggests that Google+ posts might have little or no association with citations, and too little data was available to be confident about whether four of the metrics (LinkedIn, Pinners, Q&A, and Reddits) associate with citation counts. Nevertheless, given the positive results for the majority of metrics, it would be reasonable to suppose that all may associate with citations and that, if more data could be collected, this would become evident. In that case, a social web service would still need to be sufficiently used for citations to give enough data to be worth reporting or analysing (e.g., possibly not for LinkedIn, Pinners and Q&A).

These results extend the previously published evidence of a relationship between altmetrics and citations, from arXiv preprints, a few individual journals and two social web altmetrics (Mendeley and Twitter), to tests of up to 1,891 biomedical and life sciences journals and 11 altmetrics (6 with positive results). This study also introduced a simple method, the sign test, to eliminate biases caused by citation delays and the increasing uptake of social media platforms. Another important finding is that, because of the increasing use of the social web, and Twitter in particular, publishers should consider ranking or displaying results in such a way that older articles are compensated for the lower altmetric scores caused by the lower social web use when they were published. Without this, more recent articles with the same eventual impact as older articles will tend to have much higher altmetric scores. In practice, this may not be a significant worry, however, because those searching the academic literature may prefer to find more recent articles.

Although the results above suggest that altmetrics are related to citation counts, they might be able to capture the influence of scholarly publications on a wider and different section of their readership than citation counts, which reflect only the behaviour of publishing authors. However, more research, quantitative and qualitative, is needed to identify who publishes citations to academic articles in the social web sites used to generate altmetrics (e.g., students, researchers, the general public), and why they publish them. Results in terms of user groups, users' motives and level of effort are likely to vary between social media platforms, which must be taken into consideration when applying different altmetrics in research evaluation and information retrieval.

Acknowledgements

The authors would like to thank Euan Adie of Altmetric.com for supplying the data and descriptions of it. Some of his words are used above to describe the metrics.

References

1. Priem J, Hemminger BM (2010) Scientometrics 2.0: Toward new metrics of scholarly impact on the social web. First Monday 15. Available: Accessed 7 December
2. Seglen PO (1992) The skewness of science. Journal of the American Society for Information Science 43:
3. Neylon C, Wu S (2009) Article-level metrics and the evolution of scientific impact. PLoS Biology 7: e

4. Lozano GA, Larivière V, Gingras Y (2012) The weakening relationship between the Impact Factor and papers' citations in the digital age. Journal of the American Society for Information Science and Technology 63:
5. de Solla Price DJ, Gürsey S (1976) Studies in Scientometrics I: Transience and continuance in scientific authorship. International Forum on Information and Documentation 1:
6. Tenopir C, King DW (2000) Towards electronic journals: Realities for scientists, librarians, and publishers. Washington, DC: Special Libraries Association. 488 p.
7. Haustein S (2012) Readership metrics. In: Cronin B, Sugimoto C, editors. Beyond Bibliometrics: Harnessing Multi-dimensional Indicators of Performance. Cambridge, MA: MIT Press, in press.
8. Thelwall M, Kousha K (2008) Online presentations as a source of scientific impact? An analysis of PowerPoint files citing academic journals. Journal of the American Society for Information Science and Technology 59:
9. Kousha K, Thelwall M (2008) Assessing the impact of disciplinary research on teaching: An automatic analysis of online syllabuses. Journal of the American Society for Information Science and Technology 59:
10. Delgado-López-Cózar E, Cabezas-Clavijo Á (2012) Google Scholar Metrics: An unreliable tool for assessing scientific journals. El Profesional De La Información 21:
11. Kousha K, Thelwall M (2007) Google Scholar citations and Google Web/URL citations: A multi-discipline exploratory analysis. Journal of the American Society for Information Science and Technology 58:
12. Meho LI, Yang K (2007) Impact of data sources on citation counts and rankings of LIS faculty: Web of Science vs Scopus and Google Scholar. Journal of the American Society for Information Science and Technology 58:
13. Kousha K, Thelwall M (2009) Google Book Search: Citation analysis for social science and the humanities. Journal of the American Society for Information Science and Technology 60:
14. Brody T, Harnad S, Carr L (2006) Earlier web usage statistics as predictors of later citation impact. Journal of the American Society for Information Science and Technology 57:
15. Moed HF (2005) Statistical relationships between downloads and citations at the level of individual documents within a single journal. Journal of the American Society for Information Science and Technology 56:
16. Pinkowitz L (2002) Research dissemination and impact: Evidence from web site downloads. Journal of Finance 57:
17. Priem J (2013) Altmetrics. In: Cronin B, Sugimoto C, editors. Bibliometrics and Beyond: Metrics-Based Evaluation of Scholarly Research. Cambridge: MIT Press, in press.
18. Bar-Ilan J, Shema H, Thelwall M (2013) Bibliographic references in Web 2.0. In: Cronin B, Sugimoto C, editors. Bibliometrics and Beyond: Metrics-Based Evaluation of Scholarly Research. Cambridge: MIT Press, in press.
19. Priem J, Groth P, Taraborelli D (2012) The Altmetrics Collection. PLoS ONE 7: e
20. Taraborelli D (2008) Soft peer review: Social software and distributed scientific evaluation. In: Proceedings of the 8th International Conference on the Design of Cooperative Systems. pp

21. Desai T, Shariff A, Shariff A, Kats M, Fang X, Christiano C, Ferris M (2012) Tweeting the Meeting: An in-depth analysis of Twitter activity at Kidney Week. PLoS ONE 7: e
22. Eysenbach G (2011) Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact. Journal of Medical Internet Research 13: e
23. Shuai X, Pepe A, Bollen J (2012) How the scientific community reacts to newly submitted preprints: Article downloads, Twitter mentions, and citations. PLoS ONE 7: e
24. Groth P, Gurney T (2010) Studying scientific discourse on the Web using bibliometrics: A chemistry blogging case study. In: Proceedings of WebSci10, Raleigh, NC, US. Available: Accessed 18 February
25. Shema H, Bar-Ilan J, Thelwall M (2012) Research blogs and the discussion of scholarly information. PLoS ONE 7(5): e
26. Haustein S, Siebenlist T (2011) Applying social bookmarking data to evaluate journal usage. Journal of Informetrics 5:
27. Nielsen F (2007) Scientific citations in Wikipedia. First Monday 12. Available: Accessed 21 January
28. Yan K-K, Gerstein M (2011) The spread of scientific information: Insights from the web usage statistics in PLoS article-level metrics. PLoS ONE 6: e
29. Bar-Ilan J (2012a) JASIST@mendeley. Presented at the ACM Web Science Conference Workshop on Altmetrics, Evanston, IL. Available: Accessed 21 January
30. Bar-Ilan J (2012b) JASIST. Bulletin of the American Society for Information Science and Technology 38:
31. Bar-Ilan J, Haustein S, Peters I, Priem J, Shema H, Terliesner J (2012) Beyond citations: Scholars' visibility on the social Web. In: Proceedings of the 17th International Conference on Science and Technology Indicators. Montréal, Canada. pp
32. Li X, Thelwall M (2012) F1000, Mendeley and traditional bibliometric indicators. In: Proceedings of the 17th International Conference on Science and Technology Indicators. Montréal, Canada. pp
33. Li X, Thelwall M, Giustini D (2012) Validating online reference managers for scholarly impact measurement. Scientometrics 91:
34. Priem J, Piwowar HA, Hemminger BM (2012) Altmetrics in the wild: Using social media to explore scholarly impact. arXiv.org. Available: Accessed 21 January
35. Schlögl C, Stock WG (2004) Impact and relevance of LIS journals: A scientometric analysis of international and German-language LIS journals - citation analysis versus reader survey. Journal of the American Society for Information Science and Technology 55:
36. Rowlands I, Nicholas D (2007) The missing link: Journal usage metrics. ASLIB Proceedings 59:
37. Kurtz M, Bollen J (2010) Usage bibliometrics. Annual Review of Information Science and Technology 44: 1-64.

38. Wan JK, Hua PH, Rousseau R, Sun XK (2010) The journal download immediacy index (DII): Experiences using a Chinese full-text database. Scientometrics 82:
39. Adie E, Roe W (2013) Altmetric: Enriching scholarly content with article-level discussion and metrics. Learned Publishing 26: Available: on_and_metrics/ Accessed 19 February
40. Bornmann L (2013) What is societal impact of research and how can it be assessed? A literature survey. Journal of the American Society for Information Science and Technology 64:
41. Cohen J (1988) Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates, Publishers.
42. NLM (2013) MEDLINE fact sheet. Available: Accessed 20 March
43. Waltman L, Costas R (2013) F1000 recommendations as a new data source for research evaluation: A comparison with citations. arXiv. Available: Accessed 4 April, 2013.


Research Evaluation Metrics. Gali Halevi, MLS, PhD Chief Director Mount Sinai Health System Libraries Assistant Professor Department of Medicine Research Evaluation Metrics Gali Halevi, MLS, PhD Chief Director Mount Sinai Health System Libraries Assistant Professor Department of Medicine Impact Factor (IF) = a measure of the frequency with which

More information

UNDERSTANDING JOURNAL METRICS

UNDERSTANDING JOURNAL METRICS UNDERSTANDING JOURNAL METRICS How Editors Can Use Analytics to Support Journal Strategy Angela Richardson Marianne Kerr Wolters Kluwer Health TOPICS FOR TODAY S DISCUSSION Journal, Article & Author Level

More information

Discussing some basic critique on Journal Impact Factors: revision of earlier comments

Discussing some basic critique on Journal Impact Factors: revision of earlier comments Scientometrics (2012) 92:443 455 DOI 107/s11192-012-0677-x Discussing some basic critique on Journal Impact Factors: revision of earlier comments Thed van Leeuwen Received: 1 February 2012 / Published

More information

DOI

DOI Altmetrics: new indicators for scientific communication in Web 2.0 Daniel Torres-Salinas is a Research Management Specialist in the Evaluation of Science and Scientific Communication Group in the Centre

More information

Readership data and Research Impact

Readership data and Research Impact Readership data and Research Impact Ehsan Mohammadi 1, Mike Thelwall 2 1 School of Library and Information Science, University of South Carolina, Columbia, South Carolina, United States of America 2 Statistical

More information

Do Mendeley Reader Counts Indicate the Value of Arts and Humanities Research? 1

Do Mendeley Reader Counts Indicate the Value of Arts and Humanities Research? 1 Do Mendeley Reader Counts Indicate the Value of Arts and Humanities Research? 1 Mike Thelwall, University of Wolverhampton, UK Abstract Mendeley reader counts are a good source of early impact evidence

More information

SCOPUS : BEST PRACTICES. Presented by Ozge Sertdemir

SCOPUS : BEST PRACTICES. Presented by Ozge Sertdemir SCOPUS : BEST PRACTICES Presented by Ozge Sertdemir o.sertdemir@elsevier.com AGENDA o Scopus content o Why Use Scopus? o Who uses Scopus? 3 Facts and Figures - The largest abstract and citation database

More information

Dimensions: A Competitor to Scopus and the Web of Science? 1. Introduction. Mike Thelwall, University of Wolverhampton, UK.

Dimensions: A Competitor to Scopus and the Web of Science? 1. Introduction. Mike Thelwall, University of Wolverhampton, UK. 1 Dimensions: A Competitor to Scopus and the Web of Science? Mike Thelwall, University of Wolverhampton, UK. Dimensions is a partly free scholarly database launched by Digital Science in January 2018.

More information

Keywords: Publications, Citation Impact, Scholarly Productivity, Scopus, Web of Science, Iran.

Keywords: Publications, Citation Impact, Scholarly Productivity, Scopus, Web of Science, Iran. International Journal of Information Science and Management A Comparison of Web of Science and Scopus for Iranian Publications and Citation Impact M. A. Erfanmanesh, Ph.D. University of Malaya, Malaysia

More information

Guest Editorial: Social media metrics in scholarly communication

Guest Editorial: Social media metrics in scholarly communication Guest Editorial: Social media metrics in scholarly communication Stefanie Haustein *,1, Cassidy R. Sugimoto 2 & Vincent Larivière 1,3 * stefanie.haustein@umontreal.ca 1 École de bibliothéconomie et des

More information

THE USE OF THOMSON REUTERS RESEARCH ANALYTIC RESOURCES IN ACADEMIC PERFORMANCE EVALUATION DR. EVANGELIA A.E.C. LIPITAKIS SEPTEMBER 2014

THE USE OF THOMSON REUTERS RESEARCH ANALYTIC RESOURCES IN ACADEMIC PERFORMANCE EVALUATION DR. EVANGELIA A.E.C. LIPITAKIS SEPTEMBER 2014 THE USE OF THOMSON REUTERS RESEARCH ANALYTIC RESOURCES IN ACADEMIC PERFORMANCE EVALUATION DR. EVANGELIA A.E.C. LIPITAKIS SEPTEMBER 2014 Agenda Academic Research Performance Evaluation & Bibliometric Analysis

More information

Bibliometric evaluation and international benchmarking of the UK s physics research

Bibliometric evaluation and international benchmarking of the UK s physics research An Institute of Physics report January 2012 Bibliometric evaluation and international benchmarking of the UK s physics research Summary report prepared for the Institute of Physics by Evidence, Thomson

More information

EVALUATING THE IMPACT FACTOR: A CITATION STUDY FOR INFORMATION TECHNOLOGY JOURNALS

EVALUATING THE IMPACT FACTOR: A CITATION STUDY FOR INFORMATION TECHNOLOGY JOURNALS EVALUATING THE IMPACT FACTOR: A CITATION STUDY FOR INFORMATION TECHNOLOGY JOURNALS Ms. Kara J. Gust, Michigan State University, gustk@msu.edu ABSTRACT Throughout the course of scholarly communication,

More information

Citation Metrics. BJKines-NJBAS Volume-6, Dec

Citation Metrics. BJKines-NJBAS Volume-6, Dec Citation Metrics Author: Dr Chinmay Shah, Associate Professor, Department of Physiology, Government Medical College, Bhavnagar Introduction: There are two broad approaches in evaluating research and researchers:

More information

Measuring the Impact of Electronic Publishing on Citation Indicators of Education Journals

Measuring the Impact of Electronic Publishing on Citation Indicators of Education Journals Libri, 2004, vol. 54, pp. 221 227 Printed in Germany All rights reserved Copyright Saur 2004 Libri ISSN 0024-2667 Measuring the Impact of Electronic Publishing on Citation Indicators of Education Journals

More information

Tranformation of Scholarly Publishing in the Digital Era: Scholars Point of View

Tranformation of Scholarly Publishing in the Digital Era: Scholars Point of View Original scientific paper Tranformation of Scholarly Publishing in the Digital Era: Scholars Point of View Summary Radovan Vrana Department of Information Sciences, Faculty of Humanities and Social Sciences,

More information

THE KISS OF DEATH? THE EFFECT OF BEING CITED IN A REVIEW ON

THE KISS OF DEATH? THE EFFECT OF BEING CITED IN A REVIEW ON THE KISS OF DEATH? THE EFFECT OF BEING CITED IN A REVIEW ON SUBSEQUENT CITATIONS Christian Lachance 1, Steve Poirier 2 and Vincent Larivière 1,3 1 École de bibliothéconomie et des sciences de l'information,

More information

The Financial Counseling and Planning Indexing Project: Establishing a Correlation Between Indexing, Total Citations, and Library Holdings

The Financial Counseling and Planning Indexing Project: Establishing a Correlation Between Indexing, Total Citations, and Library Holdings The Financial Counseling and Planning Indexing Project: Establishing a Correlation Between Indexing, Total Citations, and Library Holdings Paul J. Kelsey The researcher hypothesized that increasing the

More information

Journal Impact Evaluation: A Webometric Perspective 1

Journal Impact Evaluation: A Webometric Perspective 1 Journal Impact Evaluation: A Webometric Perspective 1 Mike Thelwall Statistical Cybermetrics Research Group, School of Technology, University of Wolverhampton, Wulfruna Street, Wolverhampton WV1 1LY, UK.

More information

Scopus. Advanced research tips and tricks. Massimiliano Bearzot Customer Consultant Elsevier

Scopus. Advanced research tips and tricks. Massimiliano Bearzot Customer Consultant Elsevier 1 Scopus Advanced research tips and tricks Massimiliano Bearzot Customer Consultant Elsevier m.bearzot@elsevier.com October 12 th, Universitá degli Studi di Genova Agenda TITLE OF PRESENTATION 2 What content

More information

Alphabetical co-authorship in the social sciences and humanities: evidence from a comprehensive local database 1

Alphabetical co-authorship in the social sciences and humanities: evidence from a comprehensive local database 1 València, 14 16 September 2016 Proceedings of the 21 st International Conference on Science and Technology Indicators València (Spain) September 14-16, 2016 DOI: http://dx.doi.org/10.4995/sti2016.2016.xxxx

More information

The use of bibliometrics in the Italian Research Evaluation exercises

The use of bibliometrics in the Italian Research Evaluation exercises The use of bibliometrics in the Italian Research Evaluation exercises Marco Malgarini ANVUR MLE on Performance-based Research Funding Systems (PRFS) Horizon 2020 Policy Support Facility Rome, March 13,

More information

Bibliometric measures for research evaluation

Bibliometric measures for research evaluation Bibliometric measures for research evaluation Vincenzo Della Mea Dept. of Mathematics, Computer Science and Physics University of Udine http://www.dimi.uniud.it/dellamea/ Summary The scientific publication

More information

Should author self- citations be excluded from citation- based research evaluation? Perspective from in- text citation functions

Should author self- citations be excluded from citation- based research evaluation? Perspective from in- text citation functions 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 Should author self- citations be excluded from citation- based research evaluation? Perspective

More information

Bibliometric glossary

Bibliometric glossary Bibliometric glossary Bibliometric glossary Benchmarking The process of comparing an institution s, organization s or country s performance to best practices from others in its field, always taking into

More information

WOUTER GERRITSMA, VU UNIVERSITY

WOUTER GERRITSMA, VU UNIVERSITY PUBLISHING FOR IMPACT WOUTER GERRITSMA, VU UNIVERSITY AMSTERDAM @WOWTER CHANGING THEMES IN SCIENCE Was: Publish or perish Is: Publish be cited or perish 2 Publishing for Impact CONTENTS What is article

More information

Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison

Coverage of highly-cited documents in Google Scholar, Web of Science, and Scopus: a multidisciplinary comparison This is a post-peer-review, pre-copyedit version of an article published in Scientometrics. The final authenticated version is available online at: https://doi.org/10.1007/s11192-018-2820-9. Coverage of

More information

Bibliometrics and the Research Excellence Framework (REF)

Bibliometrics and the Research Excellence Framework (REF) Bibliometrics and the Research Excellence Framework (REF) THIS LEAFLET SUMMARISES THE BROAD APPROACH TO USING BIBLIOMETRICS IN THE REF, AND THE FURTHER WORK THAT IS BEING UNDERTAKEN TO DEVELOP THIS APPROACH.

More information

The Google Scholar Revolution: a big data bibliometric tool

The Google Scholar Revolution: a big data bibliometric tool Google Scholar Day: Changing current evaluation paradigms Cybermetrics Lab (IPP CSIC) Madrid, 20 February 2017 The Google Scholar Revolution: a big data bibliometric tool Enrique Orduña-Malea, Alberto

More information

A Correlation Analysis of Normalized Indicators of Citation

A Correlation Analysis of Normalized Indicators of Citation 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 26 27 28 29 30 31 32 33 34 35 36 37 38 39 40 41 42 43 44 45 Article A Correlation Analysis of Normalized Indicators of Citation Dmitry

More information

Bibliometric Rankings of Journals Based on the Thomson Reuters Citations Database

Bibliometric Rankings of Journals Based on the Thomson Reuters Citations Database Instituto Complutense de Análisis Económico Bibliometric Rankings of Journals Based on the Thomson Reuters Citations Database Chia-Lin Chang Department of Applied Economics Department of Finance National

More information

Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals 1

Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals 1 1 Microsoft Academic: A multidisciplinary comparison of citation counts with Scopus and Mendeley for 29 journals 1 Mike Thelwall, Statistical Cybermetrics Research Group, University of Wolverhampton, UK.

More information

Corso di dottorato in Scienze Farmacologiche Information Literacy in Pharmacological Sciences 2018 WEB OF SCIENCE SCOPUS AUTHOR INDENTIFIERS

Corso di dottorato in Scienze Farmacologiche Information Literacy in Pharmacological Sciences 2018 WEB OF SCIENCE SCOPUS AUTHOR INDENTIFIERS WEB OF SCIENCE SCOPUS AUTHOR INDENTIFIERS 4th June 2018 WEB OF SCIENCE AND SCOPUS are bibliographic databases multidisciplinary databases citation databases CITATION DATABASES contain bibliographic records

More information

Enabling editors through machine learning

Enabling editors through machine learning Meta Follow Meta is an AI company that provides academics & innovation-driven companies with powerful views of t Dec 9, 2016 9 min read Enabling editors through machine learning Examining the data science

More information

Scientometric Profile of Presbyopia in Medline Database

Scientometric Profile of Presbyopia in Medline Database Scientometric Profile of Presbyopia in Medline Database Pooja PrakashKharat M.Phil. Student Department of Library & Information Science Dr. Babasaheb Ambedkar Marathwada University. e-mail:kharatpooja90@gmail.com

More information

Citation Analysis. Presented by: Rama R Ramakrishnan Librarian (Instructional Services) Engineering Librarian (Aerospace & Mechanical)

Citation Analysis. Presented by: Rama R Ramakrishnan Librarian (Instructional Services) Engineering Librarian (Aerospace & Mechanical) Citation Analysis Presented by: Rama R Ramakrishnan Librarian (Instructional Services) Engineering Librarian (Aerospace & Mechanical) Learning outcomes At the end of this session: You will be able to navigate

More information

DISCOVERING JOURNALS Journal Selection & Evaluation

DISCOVERING JOURNALS Journal Selection & Evaluation DISCOVERING JOURNALS Journal Selection & Evaluation 28 January 2016 KOH AI PENG ACTING DEPUTY CHIEF LIBRARIAN SCImago to evaluate journals indexed in Scopus Journal Citation Reports (JCR) - to evaluate

More information

hprints , version 1-1 Oct 2008

hprints , version 1-1 Oct 2008 Author manuscript, published in "Scientometrics 74, 3 (2008) 439-451" 1 On the ratio of citable versus non-citable items in economics journals Tove Faber Frandsen 1 tff@db.dk Royal School of Library and

More information

USING THE UNISA LIBRARY S RESOURCES FOR E- visibility and NRF RATING. Mr. A. Tshikotshi Unisa Library

USING THE UNISA LIBRARY S RESOURCES FOR E- visibility and NRF RATING. Mr. A. Tshikotshi Unisa Library USING THE UNISA LIBRARY S RESOURCES FOR E- visibility and NRF RATING Mr. A. Tshikotshi Unisa Library Presentation Outline 1. Outcomes 2. PL Duties 3.Databases and Tools 3.1. Scopus 3.2. Web of Science

More information

Publishing Your Research

Publishing Your Research Publishing Your Research Writing a scientific paper and submitting to the right journal Vrije Universiteit Amsterdam November 2016 Publishing Your Research 2016 Page 2 Publishing Scientific Articles The

More information

CITATION ANALYSES OF DOCTORAL DISSERTATION OF PUBLIC ADMINISTRATION: A STUDY OF PANJAB UNIVERSITY, CHANDIGARH

CITATION ANALYSES OF DOCTORAL DISSERTATION OF PUBLIC ADMINISTRATION: A STUDY OF PANJAB UNIVERSITY, CHANDIGARH University of Nebraska - Lincoln DigitalCommons@University of Nebraska - Lincoln Library Philosophy and Practice (e-journal) Libraries at University of Nebraska-Lincoln November 2016 CITATION ANALYSES

More information

INTRODUCTION TO SCIENTOMETRICS. Farzaneh Aminpour, PhD. Ministry of Health and Medical Education

INTRODUCTION TO SCIENTOMETRICS. Farzaneh Aminpour, PhD. Ministry of Health and Medical Education INTRODUCTION TO SCIENTOMETRICS Farzaneh Aminpour, PhD. aminpour@behdasht.gov.ir Ministry of Health and Medical Education Workshop Objectives Definitions & Concepts Importance & Applications Citation Databases

More information

Bibliometrics & Research Impact Measures

Bibliometrics & Research Impact Measures Bibliometrics & Research Impact Measures Show your Research Impact using Citation Analysis Christina Hwang August 15, 2016 AGENDA 1.Background 1.Author-level metrics 2.Journal-level metrics 3.Article/Data-level

More information

Battle of the giants: a comparison of Web of Science, Scopus & Google Scholar

Battle of the giants: a comparison of Web of Science, Scopus & Google Scholar Battle of the giants: a comparison of Web of Science, Scopus & Google Scholar Gary Horrocks Research & Learning Liaison Manager, Information Systems & Services King s College London gary.horrocks@kcl.ac.uk

More information

Citing and Reading Behaviours in High-Energy Physics. How a Community Stopped Worrying about Journals and Learned to Love Repositories

Citing and Reading Behaviours in High-Energy Physics. How a Community Stopped Worrying about Journals and Learned to Love Repositories arxiv:0906.5418 CERN-OPEN-2009-007 SLAC-PUB-13693 August 2009 Citing and Reading Behaviours in High-Energy Physics. How a Community Stopped Worrying about Journals and Learned to Love Repositories Anne

More information

Publishing research outputs and refereeing journals

Publishing research outputs and refereeing journals 1/30 Publishing research outputs and refereeing journals Joel Reyes Noche Ateneo de Naga University jrnoche@mbox.adnu.edu.ph Council of Deans and Department Chairs of Colleges of Arts and Sciences Region

More information

Rawal Medical Journal An Analysis of Citation Pattern

Rawal Medical Journal An Analysis of Citation Pattern Sounding Board Rawal Medical Journal An Analysis of Citation Pattern Muhammad Javed*, Syed Shoaib Shah** From Shifa College of Medicine, Islamabad, Pakistan. *Librarian, **Professor and Head, Forensic

More information

Citation Educational Researcher, 2010, v. 39 n. 5, p

Citation Educational Researcher, 2010, v. 39 n. 5, p Title Using Google scholar to estimate the impact of journal articles in education Author(s) van Aalst, J Citation Educational Researcher, 2010, v. 39 n. 5, p. 387-400 Issued Date 2010 URL http://hdl.handle.net/10722/129415

More information

CITATION CLASSES 1 : A NOVEL INDICATOR BASE TO CLASSIFY SCIENTIFIC OUTPUT

CITATION CLASSES 1 : A NOVEL INDICATOR BASE TO CLASSIFY SCIENTIFIC OUTPUT CITATION CLASSES 1 : A NOVEL INDICATOR BASE TO CLASSIFY SCIENTIFIC OUTPUT Wolfgang Glänzel *, Koenraad Debackere **, Bart Thijs **** * Wolfgang.Glänzel@kuleuven.be Centre for R&D Monitoring (ECOOM) and

More information

Open Access Determinants and the Effect on Article Performance

Open Access Determinants and the Effect on Article Performance International Journal of Business and Economics Research 2017; 6(6): 145-152 http://www.sciencepublishinggroup.com/j/ijber doi: 10.11648/j.ijber.20170606.11 ISSN: 2328-7543 (Print); ISSN: 2328-756X (Online)

More information

Lokman I. Meho and Kiduk Yang School of Library and Information Science Indiana University Bloomington, Indiana, USA

Lokman I. Meho and Kiduk Yang School of Library and Information Science Indiana University Bloomington, Indiana, USA Date : 27/07/2006 Multi-faceted Approach to Citation-based Quality Assessment for Knowledge Management Lokman I. Meho and Kiduk Yang School of Library and Information Science Indiana University Bloomington,

More information

Microsoft Academic is one year old: the Phoenix is ready to leave the nest

Microsoft Academic is one year old: the Phoenix is ready to leave the nest Microsoft Academic is one year old: the Phoenix is ready to leave the nest Anne-Wil Harzing Satu Alakangas Version June 2017 Accepted for Scientometrics Copyright 2017, Anne-Wil Harzing, Satu Alakangas

More information

Research Impact Measures The Times They Are A Changin'

Research Impact Measures The Times They Are A Changin' Research Impact Measures The Times They Are A Changin' Impact Factor, Citation Metrics, and 'Altmetrics' Debbie Feisst H.T. Coutts Library August 12, 2013 Outline 1. The Basics 2. The Changes Impact Metrics

More information

Scientific and technical foundation for altmetrics in the US

Scientific and technical foundation for altmetrics in the US Scientific and technical foundation for altmetrics in the US William Gunn, Ph.D. Head of Academic Outreach Mendeley @mrgunn https://orcid.org/0000-0002-3555-2054 Why altmetrics? http://www.stm-assoc.org/2009_10_13_mwc_stm_report.pdf

More information

Where to present your results. V4 Seminars for Young Scientists on Publishing Techniques in the Field of Engineering Science

Where to present your results. V4 Seminars for Young Scientists on Publishing Techniques in the Field of Engineering Science Visegrad Grant No. 21730020 http://vinmes.eu/ V4 Seminars for Young Scientists on Publishing Techniques in the Field of Engineering Science Where to present your results Dr. Balázs Illés Budapest University

More information

Research Ideas for the Journal of Informatics and Data Mining: Opinion*

Research Ideas for the Journal of Informatics and Data Mining: Opinion* Research Ideas for the Journal of Informatics and Data Mining: Opinion* Editor-in-Chief Michael McAleer Department of Quantitative Finance National Tsing Hua University Taiwan and Econometric Institute

More information

Citation analysis: State of the art, good practices, and future developments

Citation analysis: State of the art, good practices, and future developments Citation analysis: State of the art, good practices, and future developments Ludo Waltman Centre for Science and Technology Studies, Leiden University Bibliometrics & Research Assessment: A Symposium for

More information

VISIBILITY OF AFRICAN SCHOLARS IN THE LITERATURE OF BIBLIOMETRICS

VISIBILITY OF AFRICAN SCHOLARS IN THE LITERATURE OF BIBLIOMETRICS VISIBILITY OF AFRICAN SCHOLARS IN THE LITERATURE OF BIBLIOMETRICS Yahya Ibrahim Harande Department of Library and Information Sciences Bayero University Nigeria ABSTRACT This paper discusses the visibility

More information

Introduction to Citation Metrics

Introduction to Citation Metrics Introduction to Citation Metrics Library Tutorial for PC5198 Geok Kee slbtgk@nus.edu.sg 6 March 2014 1 Outline Searching in databases Introduction to citation metrics Journal metrics Author impact metrics

More information

Workshop Training Materials

Workshop Training Materials Workshop Training Materials http://libguides.nus.edu.sg/researchimpact/workshop Recommended browsers 1. 2. Enter your NUSNET ID and password when prompted 2 Research Impact Measurement and You Basic Citation

More information