Level 1 COUNTER Compliant Vendor Statistics are a Reliable Measure of Journal Usage


  • Gaby Haddow Curtin University of Technology




A review of:

Duy, Joanna and Liwen Vaughan. “Can Electronic Journal Usage Data Replace Citation Data as a Measure of Journal Use? An Empirical Examination.” The Journal of Academic Librarianship 32.5 (Sept. 2006): 512-17.


Objective – To identify valid measures of journal usage by comparing citation data with print and electronic journal use data.

Design – Bibliometric study.

Setting – Large academic library in Canada.

Subjects – Instances of use were collected from 11 print journals of the American Chemical Society (ACS), 9 print journals of the Royal Society of Chemistry (RSC), and electronic journals in chemistry and biochemistry from four publishers – ACS, RSC, Elsevier, and Wiley. ACS, Elsevier, and Wiley journals in chemistry-related subject areas were sampled for Journal Impact Factors and citation data from the Institute for Scientific Information (ISI).

Methods – Journal usage data were collected to determine if an association existed between: (1) print and electronic journal use; (2) electronic journal use and citations to journals by authors from the university; and (3) electronic journal use and Journal Impact Factors.

Between June 2000 and September 2003, library staff recorded the re-shelving of bound volumes and loose issues of 20 journal titles published by the ACS and the RSC.

Electronic journal usage data were collected for journals published by ACS, RSC, Elsevier, and Wiley within the ISI-defined chemistry and biochemistry subject areas. Data were drawn from the publishers’ Level 1 COUNTER compliant usage statistics. These data equate one instance of use with a user viewing an HTML or PDF full-text article. The period of data collection varied, but at least 2.5 years of data were collected for each publisher.

Journal Impact Factors were collected for all ISI chemistry-related journals published by ACS, Elsevier, and Wiley for the year 2001. Library Journal Utilization Reports (purchased from ISI) were used to determine the number of times researchers at the university cited journals in the same set of chemistry-related journals over the period 1998 to 2002. The authors call this “local citation data.” (512)

The results from electronic journal use were also analysed for correlation with the total number of citations, as reported in the Journal Citation Reports, for each journal in the sample.
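The correlation analyses described above can be illustrated with a minimal sketch. The usage figures below are hypothetical, invented purely for illustration; the study itself does not publish raw counts, and it is not stated which correlation coefficient the authors used, so a plain Pearson correlation is assumed here.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical counts for five journals:
# print re-shelvings vs. COUNTER full-text views over the same period.
print_use = [120, 45, 300, 80, 15]
electronic_use = [950, 400, 2100, 700, 120]

r = pearson_r(print_use, electronic_use)
print(f"correlation between print and electronic use: {r:.3f}")
```

A value of r close to 1 for such paired counts is the kind of result that would support treating vendor usage statistics as a proxy for print re-shelving data.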

Main results – The study found a significant correlation between print journal and electronic journal usage. A similar correlation was found between electronic journal usage data and local citation data. No significant association was found between Journal Impact Factors and electronic journal usage data. However, when electronic journal use was analysed against the total number of citations to the journals (drawn from the Journal Impact Factor calculations in Journal Citation Reports), significant correlations were found for all publishers’ journals.

Conclusion – Within the fields of chemistry and biochemistry, publisher-supplied electronic journal usage data are as valid a measure of journal usage as print journal re-shelving data. The results of the study indicate this association holds even when print journal subscriptions have ceased. Local citation data (citations made by researchers at the institution being studied) also provide a valid measure of journal use when compared with electronic journal usage results. Journal Impact Factors should be used with caution when libraries make journal collection decisions.



Author Biography

Gaby Haddow, Curtin University of Technology

Divisional Librarian, Humanities Research & Learning Services, University Library, Curtin University of Technology, Perth, WA, Australia




How to Cite

Haddow, G. (2007). Level 1 COUNTER Compliant Vendor Statistics are a Reliable Measure of Journal Usage. Evidence Based Library and Information Practice, 2(2), 84–86. https://doi.org/10.18438/B83G6S



Evidence Summaries