Citation Metrics

Understanding and Finding Citation Metrics (June 2008)

Introduction

Citation metrics are statistics gathered on the number of times an article has been cited by other articles. Several databases gather these statistics, and they are described in more detail below. The statistics are collected for two main purposes: to assess the quality of a researcher's work and to assess the quality of a journal.

For example, one might look at the citation count for a particular researcher (the grand total of all the times that researcher's articles have been cited in other articles). This metric might be used to compare the quality and productivity of that researcher's work with the work of other researchers. The assumption is that researchers with high citation counts are writing articles that describe significant research (and are therefore likely to be cited frequently by other authors), and/or that the researcher is highly productive, publishing many articles for other researchers to cite. There are also more complex metrics used to evaluate the quality of an individual's research, such as the h-Index and the g-Index. Both of these indices involve more elaborate calculations of citation counts in an attempt to provide a more accurate evaluation of quality and output; a brief worked sketch follows the links below. More information about these indices is available here:

http://www.nature.com/nature/journal/v436/n7053/full/436900a.html
http://en.wikipedia.org/wiki/Hirsch_number
http://en.wikipedia.org/wiki/G-index
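
To make the arithmetic behind these two indices concrete, here is a minimal sketch in Python using an invented list of per-article citation counts (the numbers are purely illustrative and do not describe any real researcher):

# Invented per-article citation counts for one researcher.
citations = [24, 18, 10, 6, 5, 3, 1, 0]

def h_index(counts):
    # h-Index: the largest h such that h articles each have at least h citations.
    ranked = sorted(counts, reverse=True)
    h = 0
    for rank, c in enumerate(ranked, start=1):
        if c >= rank:
            h = rank
    return h

def g_index(counts):
    # g-Index: the largest g such that the g most-cited articles together have at least g*g citations.
    ranked = sorted(counts, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

print(h_index(citations))  # 5: five articles have at least 5 citations each
print(g_index(citations))  # 8: the top 8 articles have 67 citations in total, which is at least 8*8 = 64

Web of Science and Scopus calculate the h-Index automatically from the articles they index; the sketch above is only meant to show what the resulting number represents.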

As mentioned above, citation metrics are also used to assess the quality of a journal. A journal's impact factor is basically a measure of the number of times that articles published in that journal are cited, though the calculation is a bit more involved than a simple count: it measures the "ratio between citations and recent citable items published" and "is calculated by dividing the number of current year citations to the source items published in that journal during the previous two years." The assumption is that if a journal is cited frequently, it must be publishing important, high-quality, or ground-breaking research. For these reasons, it is more prestigious for researchers to publish in journals with a high impact factor. The company Thomson Scientific (a.k.a. Institute for Scientific Information, or ISI) developed this measurement; a worked example follows the links below, and more information is available here:

http://scientific.thomson.com/free/essays/journalcitationreports/impactfactor/
http://en.wikipedia.org/wiki/Impact_factor
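
As a rough illustration of that two-year calculation, the following sketch works out a hypothetical journal's 2008 impact factor using invented figures (none of the numbers refer to a real title):

# All figures below are invented for illustration only.
citations_2008_to_recent_articles = 450.0  # times the journal's 2006-2007 articles were cited during 2008
citable_items_2006_2007 = 180              # articles and reviews the journal published in 2006-2007

impact_factor_2008 = citations_2008_to_recent_articles / citable_items_2006_2007
print(impact_factor_2008)  # 2.5: on average, each recent article was cited 2.5 times during 2008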

These metrics are important because the number of times a researcher's work has been cited, and the impact factor of the journals in which a researcher publishes, are often used to evaluate that researcher's merit in applications for academic positions, tenure review, or promotion.

Finding Citation Metrics for an Individual

Two subscription-based indexes, Web of Science and Scopus, are notable for tracking article citations. The AMNH Library does not subscribe to either of these indexes, but they are often available at university libraries, and some coverage of Web of Science is available at New York Public Library (see http://www.nypl.org/databases/index.cfm?act=3&id=896). Features of these two indexes, as well as Google Scholar, are described below. Google Scholar is free to use and available here: http://scholar.google.com.

Web of Science offers two ways to obtain citation counts for a researcher. One way is to conduct an author search on a researcher (it is useful to first use the author index to select all name variants) and, once the search has been run, click on the link to "Create Citation Report." This report includes the sum of the times the researcher's articles have been cited, as well as a more detailed analysis, including graphs and a calculation of the researcher's h-Index. However, take note of the announcement at the top of the report:

"This report reflects citations to source items indexed within Web of Science. Perform a Cited Reference Search to include citations to items not indexed within Web of Science."

So, this report only includes data compiled from those publications that Web of Science formally indexes, which excludes books, grey literature, journals outside the subject scope of the Web of Science indexing policy, etc. It is useful to note, for example, that the Museum's in-house publication, American Museum Novitates, was not indexed in Web of Science until 2004, so a citation report would exclude citation analysis of articles published in the Novitates before 2004.

For a more complete picture of a researcher's citation counts, a Cited Reference Search should be conducted. Choose the tab for conducting a Cited Reference Search and enter the researcher's name in the search box to conduct an author search. Again, it is useful to use the author thesaurus to search for name variants. This search will produce a more thorough list of that researcher's works, and the numbers listed in the column entitled "Citing Articles" can be added together to produce a more accurate total citation count. Unfortunately, such a search will not provide a citation report that includes any analysis of results, such as the h-Index.

Scopus also tracks citations, but with shorter coverage; indexing for journals in the database only goes back to 1996 (for Web of Science, indexing for some journals goes back as far as 1900). To check a researcher's citation counts, simply click the tab to conduct an author search, or perform a regular search by entering the author's name and then selecting "Authors" from the drop-down list to designate the appropriate field to search. Either way, the next step is to select the results to be analyzed and then click the "Citation Tracker" button. In Scopus it is not necessary to try to collect all variants of a researcher's name, as this database provides an author identifier feature, described as follows:

"Scopus Author Identifier uses an algorithm that matches author names based on their affiliation, address, subject area, source title, dates of publication citations, and co-authors. When you search, this feature returns documents written by that author, even when an author is cited differently."

Like the "Citation Report" in Web of Science, the "Citation Tracker" in Scopus provides a total sum of citation counts as well as the h-Index and additional analysis, but only for those publications that are formally indexed in Scopus. There is no way to see additional citation counts for non-indexed publications as is possible when using the Cited Reference Search in Web of Science.

Google Scholar does not provide citation analysis tools like the two indexes described above; however, it is free. When searching for a researcher, each resulting citation provides a count of the papers that article is "Cited by." To find as many articles as possible in order to add up these citation counts, it is important to search thoroughly on variants of the researcher's name. A good start is to try a phrase search using initials both before and after the surname, as in this search: "Smith MJ" OR "MJ Smith" - but it is not a bad idea to try other variants, being careful to omit from the citation count those articles that are by another author with a similar name.
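
Because this tallying is done by hand, the bookkeeping amounts to summing the "Cited by" counts while skipping duplicates returned by different name variants and articles by other authors. A minimal sketch in Python, with invented records:

# Invented "Cited by" figures gathered from searches such as "Smith MJ" OR "MJ Smith".
results = [
    ("Cranial morphology of ...", 42, True),   # True = written by the researcher in question
    ("Phylogeny of ...",          17, True),
    ("Cranial morphology of ...", 42, True),   # same paper found again under another name variant
    ("Polymer chemistry of ...",  90, False),  # a different M. J. Smith - excluded from the tally
]

seen = set()
total = 0
for title, cited_by, ours in results:
    if ours and title not in seen:  # skip other authors and duplicate listings
        seen.add(title)
        total += cited_by
print(total)  # 59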

Finding a Journal's Impact Factor

The traditional source for finding a journal's impact factor is Journal Citation Reports, or JCR, another product of Thomson Scientific that is available by subscription only. The AMNH Library does not subscribe to JCR; however, it is usually available from university libraries and can also be accessed from New York Public Library through their Web of Science subscription. To search this resource, simply type in the journal title, and a report with the impact factor will be presented (assuming a matching title is found). It is also possible to look at a list of journal titles by subject area in order to quickly compare their impact factor rankings. For more information about Journal Citation Reports and journal impact factors, take a look at these sites:

http://scientific.thomsonreuters.com/products/jcr/
http://en.wikipedia.org/wiki/Impact_factor

An alternative resource for determining a journal's importance is Eigenfactor.org, available at http://www.eigenfactor.org/index.php. This free resource provides two scores for a journal: "Article Influence (AI): a measure of a journal's prestige based on per article citations and comparable to Impact Factor" and "Eigenfactor (EF): A measure of the overall value provided by all of the articles published in a given journal in a year." For more information about how scores are calculated and how Eigenfactor was developed, take a look at the Eigenfactor website (provided above) and this article: "Eigenfactor: Measuring the value and prestige of scholarly journals".

Controversy Over the Use of Citation Metrics

The use of citation metrics is controversial for a number of reasons. For one, it is difficult to obtain accurate metrics on a particular researcher because the scope and coverage of databases like Web of Science, Scopus, and Google Scholar are not exhaustive. The indexing for these databases may not go back very far in time, nor are they comprehensive in the publications they index; therefore, results will vary. Also, researchers who have changed their name, have been inconsistent with the use of their initials in the byline of their published works, or simply have a common name will be more difficult to analyze. (The Author Identifier developed by Scopus helps to mitigate this problem.)

Furthermore, the validity of citation metrics as a measure of worth has been criticized on a number of fronts. One concern is that a researcher might be frequently cited because their research is contentious, not because it is high-quality, useful research. Also, using citation metrics as an evaluation tool penalizes researchers whose long-term projects result in few publications, even though such long-term studies might be very important. In addition, comparing citation metrics of researchers across disciplines may be ill-advised, since research and publication patterns vary by discipline. These are just a few issues that have been identified as potential problems. For more information, take a look at the following article on the h-Index, which describes how it is weighted in order to address certain problems and which issues are still left unresolved: http://en.wikipedia.org/wiki/Hirsch_number. Some of the pros and cons of Google Scholar vs. Web of Science for citation analysis are discussed in this article: http://www.harzing.com/h_indexjournals.htm. And for more information about the controversies surrounding journal impact factor metrics, see: http://en.wikipedia.org/wiki/Impact_factor.
