The Impact of Data Source on the Ranking of Computer Scientists Based on Citation Indicators: A Comparison of Web of Science and Scopus.
DOI: https://doi.org/10.29173/istl1596

Abstract
Conference proceedings represent a large part of the literature in computer science. Two Conference Proceedings Citation Index databases were merged into Web of Science in 2008, but few studies have evaluated the effect of that merger on citation indicators in computer science in comparison with other databases. This study explores whether the addition of the Conference Proceedings Citation Indexes to Web of Science has changed citation analysis results when compared with Scopus. It compares the citation data of 25 randomly selected computer science faculty at Canadian universities in Web of Science (with the Conference Proceedings Citation Indexes) and in Scopus. The results show that Scopus retrieved considerably more publications, including both conference proceedings and journal articles. Scopus also generated higher citation counts and h-indexes than Web of Science in this field, though the relative citation rankings produced by the two databases were similar. Either database could therefore be used if a relative ranking is sought; if the purpose is to obtain a more complete citation count or a higher h-index, Scopus is preferable. Regardless of which source is used, citation analysis as a tool for research performance assessment must be constructed and applied with caution because of its technological and methodological limitations.
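Because the comparison rests on the h-index (Hirsch 2005), a minimal sketch of how that indicator can be computed from an author's per-paper citation counts may help readers interpret the results. The function and the example counts below are hypothetical illustrations, not the study's actual data or procedure.

```python
def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(ranked, start=1):
        if citations >= rank:
            h = rank  # this paper still has at least `rank` citations
        else:
            break     # further papers have fewer citations than their rank
    return h

# Hypothetical per-paper citation counts for one researcher, as they might be
# exported from a Scopus and a Web of Science author record.
scopus_counts = [42, 31, 17, 12, 9, 6, 4, 2, 1, 0]
wos_counts    = [35, 24, 11, 8, 5, 3, 1, 0]
print(h_index(scopus_counts))  # 6 -> six papers with at least 6 citations each
print(h_index(wos_counts))     # 5
```

In this illustration the database with broader coverage yields both more indexed papers and a higher h-index, which mirrors the kind of difference the study reports between Scopus and Web of Science.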
License
Copyright (c) 2014 Li Zhang

This work is licensed under a Creative Commons Attribution 4.0 International License.
While ISTL has always been open access and authors have always retained the copyright of their papers without restrictions, articles in issues prior to no. 75 were not licensed with Creative Commons licenses. Since issue no. 75 (Winter 2014), ISTL has licensed its work through Creative Commons licenses. Please refer to the Copyright and Licensing Information page for more information.