I Had a Dream... about Uncitedness

Similar documents
Measuring the Impact of Electronic Publishing on Citation Indicators of Education Journals

In basic science the percentage of authoritative references decreases as bibliographies become shorter

Tranformation of Scholarly Publishing in the Digital Era: Scholars Point of View

Focus on bibliometrics and altmetrics

2013 Environmental Monitoring, Evaluation, and Protection (EMEP) Citation Analysis

EVALUATING THE IMPACT FACTOR: A CITATION STUDY FOR INFORMATION TECHNOLOGY JOURNALS


Citation Analysis in Research Evaluation

Bibliometric report

Cooperation between Turkish researchers and Oxford University Press. Avanos October 2017

Rawal Medical Journal An Analysis of Citation Pattern

Interpret the numbers: Putting e-book usage statistics in context

The Decline in the Concentration of Citations,

The use of bibliometrics in the Italian Research Evaluation exercises

THE USE OF THOMSON REUTERS RESEARCH ANALYTIC RESOURCES IN ACADEMIC PERFORMANCE EVALUATION DR. EVANGELIA A.E.C. LIPITAKIS SEPTEMBER 2014

Code Number: 174-E 142 Health and Biosciences Libraries

arxiv: v1 [cs.dl] 8 Oct 2014

GPLL234 - Choosing the right journal for your research: predatory publishers & open access. March 29, 2017

CITATION INDEX AND ANALYSIS DATABASES

Cited Publications 1 (ISI Indexed) (6 Apr 2012)

Using Bibliometric Analyses for Evaluating Leading Journals and Top Researchers in SoTL

DISCOVERING JOURNALS Journal Selection & Evaluation

Introduction to Citation Metrics

Scientometric Profile of Presbyopia in Medline Database

Bibliometric analysis of publications from North Korea indexed in the Web of Science Core Collection from 1988 to 2016

InCites Indicators Handbook

Publishing Scientific Research SIOMMS 2016 Madrid, Spain, October 19, 2016 Nathalie Jacobs, Senior Publishing Editor

Citation Analysis. Presented by: Rama R Ramakrishnan Librarian (Instructional Services) Engineering Librarian (Aerospace & Mechanical)

ABOUT ASCE JOURNALS ASCE LIBRARY

Developing library services to support Research and Development (R&D): The journey to developing relationships.

Bibliometric evaluation and international benchmarking of the UK s physics research

researchtrends IN THIS ISSUE: Did you know? Scientometrics from past to present Focus on Turkey: the influence of policy on research output

Can editorial peer review survive in a digital environment?

hprints , version 1-1 Oct 2008

EDITORIAL POLICY. Open Access and Copyright Policy

Research Evaluation Metrics. Gali Halevi, MLS, PhD Chief Director Mount Sinai Health System Libraries Assistant Professor Department of Medicine

Writing Styles Simplified Version MLA STYLE

How comprehensive is the PubMed Central Open Access full-text database?

Discussing some basic critique on Journal Impact Factors: revision of earlier comments

Embedding Librarians into the STEM Publication Process. Scientists and librarians both recognize the importance of peer-reviewed scholarly

How to Publish A scientific Research Article

DON T SPECULATE. VALIDATE. A new standard of journal citation impact.

Battle of the giants: a comparison of Web of Science, Scopus & Google Scholar

Publishing Your Research

Measuring Academic Impact

Web of Science Unlock the full potential of research discovery

Open Access Determinants and the Effect on Article Performance

BIG DATA IN RESEARCH IMPACT AMINE TRIKI CUSTOMER EDUCATION SPECIALIST DECEMBER 2017

Professor Birger Hjørland and associate professor Jeppe Nicolaisen hereby endorse the proposal by

BIBLIOMETRIC REPORT. Bibliometric analysis of Mälardalen University. Final Report - updated. April 28 th, 2014

Google Labs, for products in development:

Research metrics. Anne Costigan University of Bradford

THE TRB TRANSPORTATION RESEARCH RECORD IMPACT FACTOR -Annual Update- October 2015

Impact Factors: Scientific Assessment by Numbers

Introduction: Use of electronic information resources

Appropriate and Inappropriate Uses of Journal Bibliometric Indicators (Why do we need more than one?)

1. MORTALITY AT ADVANCED AGES IN SPAIN MARIA DELS ÀNGELS FELIPE CHECA 1 COL LEGI D ACTUARIS DE CATALUNYA

GUIDELINES FOR AUTHOR

Follow this and additional works at: Part of the Library and Information Science Commons

How to Write Great Papers. Presented by: Els Bosma, Publishing Director Chemistry Universidad Santiago de Compostela Date: 16 th of November, 2011

Citation Accuracy in Environmental Science Journals

Indexing in Databases. Roya Daneshmand Kowsar Medical Institute

International Journal of Library and Information Studies ISSN: Vol.3 (3) Jul-Sep, 2013

Keywords: Publications, Citation Impact, Scholarly Productivity, Scopus, Web of Science, Iran.

Bibliometrics and the Research Excellence Framework (REF)

PubMed, PubMed Central, Open Access, and Public Access Sept 9, 2009

College Libraries and Open Access: Expanding access to scholarly literature without breaking your budget

All academic librarians, Is Accuracy Everything? A Study of Two Serials Directories. Feature. Marybeth Grimes and

DOWNLOAD PDF 2000 MLA INTERNATIONAL BIBLIOGRAPHY OF BOOKS AND ARTICLES ON THE MODERN LANGUAGE AND LITERATURES

Bibliometrics & Research Impact Measures

FROM IMPACT FACTOR TO EIGENFACTOR An introduction to journal impact measures

Complementary bibliometric analysis of the Health and Welfare (HV) research specialisation

Scientific Publication

Predicting the Importance of Current Papers

Scientific Quality Assurance by Interactive Peer Review & Public Discussion

What is bibliometrics?

Citation-Based Indices of Scholarly Impact: Databases and Norms

Journal of Advanced Chemical Sciences

King's College STUDY GUIDE # 4 D. Leonard Corgan Library Wilkes-Barre, PA 18711

DOWNLOAD OR READ : THE LETTERS AND JOURNALS OF LORD BYRON PDF EBOOK EPUB MOBI

What is Web of Science Core Collection? Thomson Reuters Journal Selection Process for Web of Science

Alfonso Ibanez Concha Bielza Pedro Larranaga

1.1 What is CiteScore? Why don t you include articles-in-press in CiteScore? Why don t you include abstracts in CiteScore?

POLICY AND PROCEDURES FOR MEASUREMENT OF RESEARCH OUTPUT OF PUBLIC HIGHER EDUCATION INSTITUTIONS MINISTRY OF EDUCATION

Prices of U.S. and Foreign Published Materials

How to Choose the Right Journal? Navigating today s Scientific Publishing Environment

Journal of American Computing Machinery: A Citation Study

Web of Science Core Collection

Running a Journal.... the right one

Prices of U.S. and Foreign Published Materials

PBL Netherlands Environmental Assessment Agency (PBL): Research performance analysis ( )

Corso di dottorato in Scienze Farmacologiche Information Literacy in Pharmacological Sciences 2018 WEB OF SCIENCE SCOPUS AUTHOR INDENTIFIERS

Peter Ingwersen and Howard D. White win the 2005 Derek John de Solla Price Medal

Should the Journal of East Asian Libraries Be a Peer- Reviewed Journal? A Report of the Investigation and Decision

WEB OF SCIENCE JOURNAL SELECTION PROCESS THE PATHWAY TO EXCELLENCE IN SCHOLARLY COMMUNICATION

of Nebraska - Lincoln

On the Citation Advantage of linking to data

Establishing Eligibility As an Outstanding Professor or Researcher 8 C.F.R (i)(3)(i)

Periodical Usage in an Education-Psychology Library

International Journal of Modern Pharmaceutical Research (IJMPR)


I Had a Dream... about Uncitedness
By Eugene Garfield
The Scientist 12[14]:10, July 6, 1998

My first paper proposing the creation of the Science Citation Index (Science, 122(3159):108-111, 1955) began with a quotation from P. Thomasson and J.C. Stanley: "The uncritical citation of disputed data by a writer, whether it be deliberate or not, is a serious matter. Of course, knowingly propagandizing unsubstantiated claims is particularly abhorrent, but just as many naïve students may be swayed by unfounded assertions presented by a writer who is unaware of the criticisms. Buried in scholarly journals, critical notes are increasingly likely to be overlooked with the passage of time, while the studies to which they pertain, having been reported more widely, are apt to be rediscovered."

When the Science Citation Index (SCI) was finally launched in the sixties, I dreamt that, one day, scholars would use it to routinely avoid the unwitting perpetuation of errors. Today I can still only hope that the World Wide Web and access to the SCI on Web of Science will make that dream come true. But, in the meantime, I can only groan when I see errors perpetuated year after year.

One particular case concerns a news story by David P. Hamilton that appeared in Science seven years ago (Science, 250:1331-2, 1990 and Science, 251:25, 1991). Hamilton zealously criticized scholarship in the sciences and social sciences, but especially in the arts and humanities, using citation frequency data. He concluded, without qualification, that huge percentages of the scholarly literature were never cited. His misguided reports on uncitedness have unduly influenced many scholars and policy makers ever since. His claims continue to be cited even though David Pendlebury subsequently published a factually correct rebuttal in a letter in Science (251:1410-1411, 1991), which appeared with several others under the title "Science, Citation, and Funding." Pertinent extracts of Pendlebury's letter follow:

"Hamilton's two articles about the percentage of journal literature that remains uncited within five years of publication require comment and further explanation. The figures reported by Hamilton--47.4 percent uncited for the sciences, 74.7 percent for the social sciences, and 98.0 percent for the arts and humanities--are indeed correct. However, as Maxine Singer was quoted as saying in Hamilton's article, it is necessary to know what is in the numbers before interpreting them.

"These statistics represent every type of article that appears in journals indexed by the Institute for Scientific Information in its [Citation Indexes]. The journals ISI indexes contain not only articles, reviews, and notes, but meeting abstracts, editorials, obituaries, letters like this one, and other marginalia, which one might expect to be largely uncited. In 1984, the year of the data quoted by Hamilton, about 27 percent of the items indexed in the Science Citation Index were such marginalia. The comparable figures for the Social Sciences Citation Index and the Arts and Humanities Citation Index were 48 percent and 69 percent, respectively.

"If one analyzes the data more narrowly and examines the extent of uncited articles alone (this information was not yet available when Hamilton wrote his articles), the figures shrink, some more than others: 22.4 percent of 1984 science articles remained uncited by the end of 1988, as did 48.0 percent of social sciences articles and 93.1 percent of articles in arts and humanities journals.

"The figures originally quoted by Hamilton seem to have been interpreted by many readers as some sort of measure of the health of U.S. science. The numbers, however, reflect a lack of citation of papers by authors the world over, not only those by U.S. researchers....

"If one restricts the analysis even further and examines the extent of uncited articles by U.S. authors alone, the numbers are even less "worrisome." Only 14.7 percent of 1984 science articles by U.S. authors were left uncited by the end of 1988. We estimate the share of uncited 1984 articles by non-U.S. scientists to be about 28 percent....

"A certain level of 'uncitedness' in journal literature is probably more an expression of the process of knowledge creation and dissemination than any sort of measure of performance. A trend toward more or less 'uncitedness,' however, might be meaningful. For the 1980s, we see no such trend in the scientific literature: the numbers are essentially flat, both for the United States alone and for the world....

"We hope this information clarifies the record and will end further misunderstanding or politicization of these statistics."

As Pendlebury indicates, and as we all know, journals such as Science, Nature, and the New England Journal of Medicine publish many high-impact research articles, but they also publish many other editorial items, such as letters, obituaries, and book reviews. All of these are indexed in the Science Citation Index. When one quantifies the output and utility of journals, research and review articles must be differentiated from the rest. When ISI calculates journal impact factors, only "substantive" items are included in the counts. So when one describes uncitedness, one must specify which editorial categories are included in the calculation. In Pendlebury's data, uncitedness is defined in terms of ISI's journal coverage.
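To make the denominator point concrete: the standard two-year impact factor divides the citations a journal receives in a year by the number of "substantive" (citable) items it published in the two preceding years, so the classification of item types directly changes the result. A minimal sketch, with invented item counts:

```python
# Two-year journal impact factor: citations received this year to items
# published in the previous two years, divided by the "substantive"
# (citable) items from those two years. All counts below are hypothetical.

SUBSTANTIVE = {"article", "review"}  # item types counted in the denominator

def impact_factor(citations_this_year: int, item_types: list[str]) -> float:
    citable = sum(1 for t in item_types if t in SUBSTANTIVE)
    return citations_this_year / citable

# A hypothetical journal: 200 substantive items plus 150 pieces of marginalia.
items = ["article"] * 180 + ["review"] * 20 + ["letter"] * 100 + ["editorial"] * 50

print(impact_factor(600, items))  # 600 / 200 = 3.0
print(600 / len(items))           # 600 / 350 ≈ 1.7 if marginalia were counted
```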

Since most authors sooner or later mention their own work in a review or as part of an ongoing series, it is indeed remarkable how many papers are never cited. Due to the cumulative character of science and scholarship, a great deal of the literature is cited but once, simply because supersedence is a fundamental characteristic of the literature. To get an idea of how often "onesies" occur, consider this: in an SCI study covering the years 1945-88, almost 56 percent of all types of publications (papers, books, etc.) were cited just once. Even that figure is inflated, however, because many references include typographical errors or spelling or pagination variations that defy easy unification.

A small group of journals accounts for more than 90 percent of significant research. The overwhelming majority of articles published in the 200 journals with the highest cumulative impact (The Scientist, 12[3]:11-12, Feb. 2, 1998, and pages 12-13 of this issue) are cited within a few years of publication, and after five years uncitedness is almost nonexistent.
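As a rough illustration of the unification problem mentioned above, the sketch below collapses trivially variant reference strings into one counting key. The variants and the normalization rules are invented for illustration; real unification must also cope with misspellings and wrong page numbers:

```python
import re
from collections import Counter

def reference_key(ref: str) -> str:
    """Collapse trivial variations (case, punctuation, spacing) so that
    variant strings for the same reference count as one. Illustrative only."""
    key = ref.lower()
    key = re.sub(r"[^\w\s]", " ", key)   # drop punctuation
    key = re.sub(r"\s+", " ", key)       # collapse whitespace
    return key.strip()

refs = [
    "Garfield E, Science 122:108, 1955",
    "GARFIELD, E. Science 122: 108 (1955)",
    "Garfield E Science 122 108 1955",
]
print(Counter(reference_key(r) for r in refs))  # all three collapse to one key
```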

Science, 251:1410-1411, 1991
David A. Pendlebury
Letters to the Editor: Science, Citation, and Funding

Hamilton's two articles about the percentage of journal literature that remains uncited within 5 years of publication require comment and further explanation. The figures reported by Hamilton -- 47.4% uncited for the sciences, 74.7% for the social sciences, and 98.0% for the arts and humanities -- are indeed correct. However, as Maxine Singer was quoted as saying in Hamilton's first article, it is necessary to know what's in the numbers before interpreting them.

These statistics represent every type of article that appears in journals indexed by the Institute for Scientific Information (ISI) in its Science Citation Index, Social Sciences Citation Index, and Arts & Humanities Citation Index. The journals ISI indexes contain not only articles, reviews, and notes, but also meeting abstracts, editorials, obituaries, letters like this one, and other marginalia, which one might expect to be largely uncited. In 1984, the year of the data quoted by Hamilton, about 27% of the items indexed in the Science Citation Index were such marginalia. The comparable figures for the social sciences and arts and humanities were 48% and 69%, respectively.

If one analyzes the data more narrowly and examines the extent of uncited articles alone (this information was not yet available when Hamilton wrote his articles), the figures shrink, some more than others: 22.4% of 1984 science articles remained uncited by the end of 1988, as did 48.0% of social sciences articles and 93.1% of articles in arts and humanities journals. It ought to be pointed out that the book represents a considerably more important vehicle of communication in the social sciences and humanities than in the sciences. The figures given above reflect only the journal literature of the social sciences and arts and humanities.

The figures originally quoted by Hamilton seem to have been interpreted by many readers as some sort of measure of the health of U.S. science. The numbers, however, reflect a lack of citation of papers by authors the world over -- not only those by U.S. researchers. This point was raised in Hamilton's first article. If one restricts the analysis even further and examines the extent of uncited articles by U.S. authors alone, the numbers are even less "worrisome." Only 14.7% of 1984 science articles by U.S. authors were left uncited by the end of 1988. We estimate the share of uncited 1984 articles by non-U.S. scientists to be about 28%. (Comparable figures for social sciences and arts and humanities articles by U.S. authors are not yet available.)

A certain level of "uncitedness" in the journal literature is probably more an expression of the process of knowledge creation and dissemination than any sort of measure of performance. A trend toward more or less "uncitedness," however, might be meaningful. For the 1980s, we see no such trend in the scientific literature: the numbers are essentially flat, both for the United States alone and for the world. In the social sciences, however, we do detect a decrease in uncited papers -- from 49.7% for 1981 articles to 45.3% for 1985 articles. In the arts and humanities, the figure of 93% uncited is fairly steady from 1981 through 1985.

This, we hope, serves to illustrate the great range of statistics one can derive depending upon what "cut" is made from the ISI databases. For example, articles published in the highest impact journals like Science are almost never left uncited. We will be generating, over the coming months, article-only statistics, both U.S. and worldwide, for subdisciplines in the sciences, social sciences, and humanities, corresponding to the overall database statistics referred to by Hamilton in his second article. We have not yet produced a report on these statistics, but in light of the great interest in the numbers, we will now do so.

We hope this information clarifies the record and will end further misunderstanding or politicization of these statistics.

David A. Pendlebury
Research Department
Institute for Scientific Information
3501 Market Street
Philadelphia, PA 19104
david.pendlebury@isinet.com

Articles by David Hamilton:
David Hamilton, "Publishing by -- and for? -- the Numbers," Science, 250:1331-2, 1990
David Hamilton, "Research Papers: Who's Uncited Now?" Science, 251:25, 1991
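The mechanics of Pendlebury's correction reduce to a mixture identity: the headline uncitedness rate is a weighted average of the rate for articles and the (much higher) rate for marginalia. A sketch using the 1984 science figures from the letter, with an assumed uncitedness rate for marginalia, since the letter quotes none:

```python
# Headline uncitedness is a mixture over item types:
#   overall = f_marginalia * u_marginalia + (1 - f_marginalia) * u_articles
# The first two figures come from Pendlebury's letter; the marginalia rate
# is an ASSUMPTION for illustration only.

f_marginalia = 0.27   # share of SCI-indexed 1984 items that were marginalia
u_articles   = 0.224  # 1984 science articles uncited by the end of 1988
u_marginalia = 0.90   # ASSUMED: marginalia are rarely cited

overall = f_marginalia * u_marginalia + (1 - f_marginalia) * u_articles
print(f"{overall:.1%}")  # ~40.7%: the item mix alone inflates the headline rate
```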

Science, 250:1331-2, 1990
David P. Hamilton

Also see: Hamilton DP, "Research Papers: Who's Uncited Now?" Science, 251:25, 1991; Pendlebury DA, "Science, Citation, and Funding" (letter to the editor), Science, 251:1410-1411, 1991

Publishing by -- and for? -- the Numbers

New evidence raises the possibility that a majority of scientific papers make negligible contributions to knowledge

Citations, according to the conventional wisdom, are the glue that binds a research paper to the body of knowledge in a particular field and a measure of the paper's importance. So what fraction of the world's vast scientific literature is cited at least once? Seventy percent? Eighty percent? Guess again. Statistics compiled by the Philadelphia-based Institute for Scientific Information (ISI) indicate that 55% of the papers published between 1981 and 1985 in journals indexed by the institute received no citations at all in the 5 years after they were published. The figure was derived by ISI analyst David Pendlebury, who at the request of Science searched ISI's extensive database of scientific citations.

And that's the good news. ISI's database covers only the top science and social science journals -- some 4500 out of nearly 74,000 scientific titles listed in the Bowker / Ulrich's database, a commercial listing of all periodicals. "The conventional wisdom in the field is that 10% of the journals get 90% of the citations," says Pendlebury. "These are the journals that get read, cited, and have an impact."

Even those papers that do get cited aren't cited very often. An earlier ISI study of articles in the hard sciences (including medicine and engineering) published between 1969 and 1981 revealed that only 42% received more than one citation. (Because of database limitations at the time, that study didn't examine the number of uncited papers.) If a similar trend holds for 1981 to 1985, then as much as 80% of papers published during that period have never been cited more than once. Moreover, self-citation -- a practice in which authors cite their own earlier work -- accounts for between 5% and 20% of all citations, according to Pendlebury.
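The "as much as 80%" figure is an extrapolation that compounds the two statistics just quoted. A worked reconstruction, assuming (as the wording suggests) that the 42% applies to papers that were cited at all:

```python
# Reconstruction of the "as much as 80%" estimate in the article above.
# ASSUMPTION implied by the text: "only 42% received more than one citation"
# refers to papers that received at least one citation.

uncited = 0.55                  # share of 1981-85 papers with zero citations
multi_cited_given_cited = 0.42  # cited papers with more than one citation

cited_once = (1 - uncited) * (1 - multi_cited_given_cited)  # ~26%
at_most_once = uncited + cited_once
print(f"{at_most_once:.0%}")    # ~81%, i.e. "as much as 80%"
```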

Does this mean that more than half -- and perhaps more than three-quarters -- of the scientific literature is essentially worthless? Of nearly 20 academicians, federal officials, and science policy analysts contacted by Science, few were willing to state the case so harshly. But a majority agreed that the high percentage of uncited papers is certainly reason for concern. Chief among the explanations offered was that researchers are publishing far too many inconsequential papers in order to pad their resumes. A typical reaction is that voiced by Robert Park, Washington director of the American Physical Society: "My God! That is fascinating -- it's an extraordinarily large number. It really does raise some serious questions about what it is we're doing." Ray Bowen, assistant director for engineering at the National Science Foundation, agreed. "It does suggest that a lot of work is generally without utility in the short-term sense." Similarly, Frank Press, president of the National Academy of Sciences, noted that "there are obvious concerns which are worrisome, namely that the work is redundant, it's me-too type of follow-on papers, or the journals are printing too much."

A few officials, however, cautioned against interpreting the uncitedness figure as evidence of overpublication on two grounds: that even uncited papers can influence other researchers, and that the figure may be skewed because ISI databases include some foreign journals with minimal impact. "Maybe 10,000 people used the particular data from [an uncited article] because it was just sent out as an informal paper, or the numbers appeared in the traffic sent out over an electronic network," said Charles Brownstein, NSF's assistant director for computer and information science. "I just have no way to even begin to evaluate [the 55% figure]." Maxine Singer, president of the Carnegie Institution, posed her objection as a question: "So this includes a lot of journals published in countries of minimal scientific effort? It's very hard to evaluate a number unless you know what's in it."

But even some who were reluctant to assign much importance to the statistic admitted that it surprised them. "It strikes me as a high figure -- I would have guessed one-third," said William Raub, acting director of the National Institutes of Health. "But I don't know what to make of it." Timothy Springer, a Harvard cancer researcher, was more direct. "It is higher than I'd have expected," he said. "It indicates that too much is published. A lot of us think too much is published."

If Springer is right, the publishing industry is at least partly responsible. The number of scholarly journals in all fields (scientific and others) has risen from 70,000 to 108,590 over the past 20 years, according to the Bowker / Ulrich's database. Crunched by rising subscription prices and the sheer number of titles, libraries have been unable to keep up with the flood of information. The average member of the Association of Research Libraries now holds only about 27,000 titles, about 26% of the total available.

To critics of the academic promotion system like University of Michigan president James Duderstadt, the growing number of journals and the high number of uncited articles simply confirm their suspicion that academic culture encourages spurious publication. "It is pretty strong evidence of how fragmented scientific work has become, and the kinds of pressures which drive people to stress number of publications rather than quality of publications," Duderstadt said. Most of that pressure is rooted in the struggle for grants and promotions.

"The obvious interpretation is that the publish or perish syndrome is still operating in force," said David Helfand, chairman of the astronomy department at Columbia University. (Helfand is best known outside his field for refusing to accept a tenured appointment at Columbia, instead preferring to work under a renewable five-year contract.) "You get a stack of 60 papers in the mail when you're on a tenure committee, and it's sort of stupid, because you know you're not going to read them all." Allen Bard, editor of the Journal of the American Chemical Society, added: "In many ways, publication no longer represents a way of communicating with your scientific peers, but a way to enhance your status and accumulate points for promotion and grants."

For just this reason, some universities have begun limiting the number of papers they will accept for evaluation. The Harvard Medical School, whose promotion committees will only review applicants' 5 to 10 most significant papers, is the most celebrated example, but other schools and some federal agencies seem to be following suit. New rules at NSF, for instance, allow scientists to submit no more than five publications with their grant applications. Even so, it may be a while before this trend moves beyond elite research universities. "At the state colleges and universities, where they believe publication is their road to credibility, there's still a great emphasis on the number of publications," says Vito Perrone, a Harvard School of Education researcher who has studied academic publishing for the Carnegie Foundation for the Advancement of Teaching.

Pendlebury says he plans further analysis of the citation data within the next few months. In particular, he intends to examine how the percentage of uncited papers varies between disciplines and between journals put out by commercial and nonprofit publishers, as well as the frequency of uncited papers in upper-echelon journals such as Nature, Science, Cell, the New England Journal of Medicine, and so forth.

So far, there is only a hint as to what further analysis will reveal -- and it's bad news for social scientists. A preliminary ISI study conducted on papers published in the hard sciences in 1984 revealed that only 40% of them received no citations in the 4 years following publication, a fact which suggests that social science papers go uncited at a rate much greater than 55%.

One consequence of this phenomenon is that many researchers have become deeply suspicious of articles not published in so-called first-tier journals. "I routinely have to go into the 'deep literature' -- those journals I no longer have time to read on a daily basis -- and it is frequently a waste of time," says MIT biology professor Richard Young. If the bottom 80% of the literature "just vanished," he says, "I doubt the scientific enterprise would suffer."

The ISI statistics would seem to give academics, university administrators, and government officials a great deal to think about.

David P. Hamilton

Science, 251:25, 1991
David P. Hamilton

Also see: Pendlebury DA, "Science, Citation, and Funding" (letter to the editor), Science, 251:1410-1411, 1991; Hamilton DP, "Publishing by -- and for? -- the Numbers," Science, 250:1331-2, 1990

Research Papers: Who's Uncited Now?

Scientists who like to one-up their colleagues in other disciplines can now do so in a new way. Last month, David Pendlebury of the Philadelphia-based Institute for Scientific Information came up with the startling conclusion that 55% of the papers published in journals covered by ISI's citation database did not receive a single citation in the 5 years after they were published (Science, 7 December, p. 1331). Now Pendlebury has extended his analysis by looking at how the "uncitedness rate" varies among scientific disciplines. Neither engineering researchers nor social scientists are likely to be happy with the results.

In this latest study, Pendlebury looked only at papers published in 1984 and the citations they accumulated through 1988. (ISI's database covers the top 10% of all scientific journals published worldwide.) When he grouped the data into broad categories, Pendlebury found that physics and chemistry had the lowest rates of uncitedness -- 36.7% and 38.8% of the papers published in those disciplines, respectively, were not cited at all in the 4 years following publication. Close behind were the biological sciences (41.3%), the geosciences (43.6%), and medicine (46.4%). These subjects all fall below the uncitedness average of 47.4% for the so-called hard sciences -- all scientific disciplines including engineering and medicine, but excluding the social sciences. (Pendlebury had first reported the hard science average as 40%; the later number, he says, is "more systematically generated.")

The figure for engineering, however, is above that average -- well above it, in fact. More than 72% of all papers published in engineering had no citations at all. Pendlebury says he is at a loss to explain this anomaly, although he suggests that "sociological factors" might influence the way engineering researchers cite each other's work.

Within these broad categories, there is a wide variation among individual subdisciplines. Atomic, molecular, and chemical physics, a field in which only 9.2% of articles go uncited, took top honors. Next was virology, with an uncitedness rate of 14.0%. In rapid succession came particle and field physics (16.7%), inorganic and nuclear chemistry (17.0%), nuclear physics (17.3%), fluid and plasma physics (18.2%), organic chemistry (18.6%), condensed matter physics (19.1%), and biochemistry and molecular biology (19.4%). Among fields that didn't fare so well: electrochemistry (64.6%), developmental biology (61.5%), optics (49.1%), and acoustics (40.1%).

As for engineering, every field showed high rates of uncitedness, with civil engineering highest at 78.0%. Next came mechanical (76.8%), aerospace (76.8%), electrical (66.2%), chemical (65.8%), and biomedical (59.1%) engineering. A handful of other applied fields showed similarly high rates: construction and building technology (84.2%), energy and fuels (80.3%), applied chemistry (78.0%), materials science-paper and wood (77.6%), metallurgy and mining (75.2%), and materials science-ceramics (72.8%).

Papers published in the social sciences fared no better. Political science (90.1%), international relations (82.8%), language and linguistics (79.8%), anthropology (79.5%), sociology (77.4%), business (76.6%), and archeology (76%) all exceeded the social science average of 74.7%. Social psychology articles, on the other hand, seem to be relatively highly cited; only 35.4% received no citations at all.

But scientists, social and otherwise, can take heart. Within the arts and humanities (where admittedly citation is not so firmly entrenched), uncitedness figures hit the ceiling. Consider, for example, theater (99.9%), American literature (99.8%), architecture (99.6%), and religion (98.2%). And, in one curious anomaly, articles in history (95.5%) and philosophy (92.1%) were relatively uncited, while those in history and philosophy of science (29.2%) were not.

DAVID P. HAMILTON

Also see:
http://garfield.library.upenn.edu/papers/pendlebury.html
http://garfield.library.upenn.edu/papers/hamilton1.html
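Computationally, the field-by-field breakdown above is just a zero-citation rate grouped by discipline. A minimal sketch over invented records; the real analysis ran over ISI's full 1984-88 citation data:

```python
from collections import defaultdict

# Hypothetical (field, citation-count) records for 1984 papers, counted
# through 1988. The fields and counts are invented for illustration.
papers = [
    ("physics", 3), ("physics", 0), ("physics", 1),
    ("engineering", 0), ("engineering", 0), ("engineering", 2),
    ("theater", 0), ("theater", 0),
]

totals, uncited = defaultdict(int), defaultdict(int)
for field, n_citations in papers:
    totals[field] += 1
    uncited[field] += (n_citations == 0)

for field in totals:
    print(f"{field}: {uncited[field] / totals[field]:.1%} uncited")
```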
