Research Article
Academic Librarians’ Knowledge of Bibliometrics and
Altmetrics
Tara Malone
Assistant Professor and Librarian
Department of Health Sciences Library and Information Management
Robert M. Bird Library
University of Oklahoma Health Sciences Center
Oklahoma City, Oklahoma, United States of America
Email: tara-malone@ouhsc.edu

Susan Burke
Associate Professor
University of Oklahoma School of Library and Information Studies
Norman, Oklahoma, United States of America
Email: sburke@ou.edu

Received: 18 Apr. 2016    Accepted: 9 July 2016
© 2016 Malone and Burke. This is an Open Access article distributed under the
terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0
International License (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To measure
the knowledge and opinions that academic librarians have of established and
emerging research metrics.
Methods – An online
survey was distributed to all academic librarians in Oklahoma during Summer
2015.
Results – Librarians
were less familiar with altmetrics than with bibliometrics, but they viewed
altmetrics as effective and were interested in receiving training to learn more
about them. Librarians who had been in the profession for over five years knew
more about both bibliometrics and altmetrics than newer librarians.
Conclusions – Technological
advances and changes in the ways that research products are shared have led to
the possibility of and need for new ways of measuring research impact.
Altmetrics have emerged to fill this need, but academic librarians need more
familiarity and training to be able to fulfill a role as providers of these
metrics.
Introduction
With the advent of social media, digital publishing, and born-digital
research, scholarly research impact is changing. Traditional methods of
evaluating research impact, such as journal impact factor (JIF) and citation
counts, have long served as benchmarks of research productivity. More recently,
alternative metrics for assessing impact, altmetrics, have emerged. In addition
to citation counts, altmetrics track the impact of individual research articles
and other forms of scholarly output via media attention, article views and
downloads, database inclusion, and more (Cooper, 2015).
The roles of academic librarians are evolving to include the provision
of bibliometrics services to researchers, including both established and
emerging measures. With more tools available for tracking altmetrics but no
established standards for conducting altmetrics analysis, it is challenging
for academic librarians to help investigators and students understand and use
these new measures to complement traditional metrics. While
the literature strongly advocates for academic librarians to provide scholarly
communication metrics, the fluency of librarians in these methods has yet to be
established. This study aims to provide data on this topic.
Literature Review
Research Impact, Old and New
Citation analysis and the JIF were conceived by Eugene
Garfield and are widely considered the forerunners of modern bibliometrics (Carpenter, 2014; Herther, 2013).
Originally intended to assist librarians in assessing journal subscriptions,
the JIF is now commonly used during tenure and promotion review to assess the
quality of a researcher’s work by making assumptions about the value of
publications in part by whether they appeared in high impact journals. This
common misapplication of the JIF has mired the metric in controversy.
Criticisms of the JIF include “gaming” the numbers through self-citation;
compulsory citations imposed at the behest of journals; questions of
mathematical validity; failures to replicate the metric’s calculations; and
lack of comparability between disciplines (Bollen, Van de Sompel, Smith, & Luce, 2005; Brown,
2014; Carpenter, 2014; Neylon & Wu, 2009). Much of a journal’s impact factor
derives from citations to a small percentage of its articles, and a lack of
context surrounds most citation counts (Belter, 2015).
Measuring research quality based on citation counts can be problematic due to
the years it takes for citations to accumulate and reveal impact, and the lack
of journals in some disciplines (Adams & Bullard, 2014; Brown, 2014; Neylon &
Wu, 2009). New
measures based on citation counts have emerged with varying degrees of success.
Among them are the H-Index, defined as the largest number h such that h of an
author’s papers have each received at least h citations (Hirsch, 2005); the
G-Index, which places more emphasis on highly cited papers; and Google’s
i10-Index, based on the number of a researcher’s publications that have
garnered at least 10 citations (Gutierrez, Beall, & Forero, 2015).
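To make these definitions concrete, the following is a minimal Python sketch
that computes all three indices from a list of per-paper citation counts. The
g-index formula used here (the largest g such that an author’s top g papers
together hold at least g² citations) is Egghe’s standard definition, which the
text above only paraphrases, and the citation counts shown are hypothetical.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each (Hirsch, 2005)."""
    ranked = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(ranked, start=1) if c >= rank)

def g_index(citations):
    """Largest g such that the top g papers together have at least g^2 citations.
    (Egghe's standard definition; capped here at the number of papers.)"""
    ranked = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(ranked, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

def i10_index(citations):
    """Number of papers with at least 10 citations (Google's i10-Index)."""
    return sum(1 for c in citations if c >= 10)

papers = [48, 33, 12, 10, 9, 4, 1, 0]  # hypothetical citation counts
print(h_index(papers), g_index(papers), i10_index(papers))  # 5 7 4
```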
While the JIF and citation counts continue to be
instrumental in shaping individual researchers’ careers, these measures have
not kept pace with the explosion of the digital dissemination of scholarship (Carpenter, 2014). As
technology increasingly reshapes the research environment, more scholarly works
are being shared via such social media sites as ResearchGate, Academia.edu,
Facebook, Twitter, or even Pinterest, as well as through blogs and online
reference managers (Adie & Roe, 2013; Bar-Ilan et al., 2013).
Altmetrics track mentions of journal articles, data
sets, presentations, and other research products on social media; bookmarks and
downloads in online reference managers; mentions in popular media; activity on
data or slide sharing sites; and other web-based forums where scholarship is being
shared (ACRL Research Planning and Review Committee, 2014;
Adams & Bullard, 2014; Bonn, 2014).
Scholarly publishers are moving into the altmetrics field. For example, Wiley,
Elsevier, and Nature have formed partnerships with Altmetric.com, an early
aggregator of web impact indicators (Bornmann, 2014; Brigham, 2014;
Information Today Newsbytes, 2014). Elsevier recently purchased the online reference manager Mendeley,
while Plum Analytics and their research database, PlumX, were purchased by
EBSCO in 2014. As a whole, interest in and adoption of altmetrics tools seem to
be growing steadily (Roemer & Borchardt, 2013).
Excitement about the potential of altmetrics to
revolutionize research impact measurement is extensive in the LIS field. One of
these new measures’ attractive features is their speed. Traditional citations
may take years to yield measurable impact, while altmetrics can theoretically
reveal impact in weeks, days, or even minutes (Brown, 2014; Dinsmore, Allen, &
Dolby, 2014; Lapinski, Piwowar, & Priem, 2013; Piwowar & Priem, 2013). Also important is the possibility of measuring a broader scope of
materials and “products” in addition to traditional manuscripts (Bornmann, 2014, 2015; Herther, 2013;
Howard, 2013; Lapinski et al., 2013; Piwowar & Priem, 2013).
There are caveats to using altmetrics: author disambiguation remains difficult
and the numbers can be gamed (Bornmann, 2014; Brigham, 2014; Brown, 2014;
Galligan & Dyas-Correia, 2013); the sheer volume of data needed to track
altmetrics across the internet is daunting (Adie & Roe, 2013); and traditional
measures like citation analysis and the JIF are firmly embedded in the
promotion and tenure process (Bazeley, Waller, & Resnis, 2014).
The
chief critique of altmetrics is the lack of empirical examination, standardization,
or regulation (Bornmann, 2014; Brigham, 2014; Carpenter, 2014; Herther, 2013; Lapinski
et al., 2013). At this stage of
development, altmetrics are more suited to complementing traditional metrics
than to supplanting them. The National Information Standards Organization
(NISO) has identified 25 key areas in which altmetrics need standardization,
including the identification of research types to be tracked, more empirical
investigation into the use of altmetrics as research impact measures, and
strategies to address potential manipulation (Carpenter,
2014; Gunn, 2014; Herther, 2013; National Information Standards Organization, 2014).
The Emerging Role of the Academic Librarian
Research libraries are being called upon to provide
improved research support services and access to bibliometric tools. This is a
natural extension of the LIS field, in which bibliometrics practice and research
have long been situated. New metrics provide the opportunity for academic librarians
to practise their skills such as database navigation and analysis, familiarity
with tools such as Web of Science and Scopus, and experience with the
university promotion and tenure processes, in a new context (Åstrom &
Hansson, 2012; Bladek, 2014; Brigham, 2014; Brown, 2014; Gumpenberger,
Wieland, & Gorraiz, 2012; Herther, 2013; Kennan, Corrall, & Afzal, 2014;
MacColl, 2010a, 2010b; Roemer & Borchardt, 2013, 2015a).
Despite the notion that bibliometrics and altmetrics
services are an excellent fit for the academic library, it may be that academic
librarians are not, as a whole, educationally prepared to provide them. Formal
training in bibliometrics is a rarity for librarians, which has led to a call
for the incorporation of bibliometrics education into the generalized LIS
curriculum. In many other countries where interest and emphasis may be strong,
formal education or training is also lacking (Bladek, 2014; Kennan, Corrall, &
Afzal, 2014; Zhao, 2011).
Current Research on Altmetrics and Academic Librarians
Roemer and Borchardt (undated) state that “Academic librarians have been
and continue to be involved with altmetrics at every level,” but quantitative
measures of that involvement are scarce in the LIS literature. In a November
2015 conference paper, Konkiel, Sutton, and Levine-Clark reported on some
aspects of a large-scale survey of U.S. academic librarians’ knowledge and use
of altmetrics. While the bulk of those data are not yet published and available
for review, preliminary results from the study indicate a low familiarity with
or use of altmetrics when compared to more traditional metrics such as the JIF
or citation counts. Respondents to the team’s survey also indicated low numbers
of reference encounters surrounding metrics of any kind. Librarians’ interest
in altmetrics and bibliometrics is not just a U.S. phenomenon; it is
international. Kennan et al. (2014) surveyed librarians in Australia, the
U.K., Ireland, and New Zealand in 2012 and reported their opinions,
predominantly of bibliometrics, with some mention of altmetrics. There is also
an unpublished research report on the use of altmetrics by Spanish librarians
and scholars (González-Fernández-Villavicencio, Dominguez-Aroca,
Calderón-Rehecho, & Garcia-Hernández, 2015). The scarcity of published
studies illustrates the need for more concrete knowledge on this topic. What do
academic librarians truly know about altmetrics?
Aims
Before best practices can be established in the area of providing research
impact services to faculty or students, it is important to ascertain the level
of awareness that academic librarians have of new and traditional metrics and
the services that provide them. Research questions for the study were:

1. Are academic librarians more familiar with established measures of research impact (bibliometrics) than with the emerging field of altmetrics?
2. What are the attitudes of academic librarians toward altmetrics versus traditional measures (bibliometrics)?
3. Are academic librarians being called upon by faculty to provide information about new research impact measures, and if so, what has characterized these interactions?
4. What are academic libraries doing, or what could they be doing, regarding altmetrics?
The development of the research questions and identification of the
independent variables for the study were based on two concepts. First, simple
logic suggested that bachelor’s degree and higher granting institutions would
be more engaged with publication and research impact, and librarians there
would be more likely to be familiar with research impact measures because they
would be called upon to engage with them. It also seemed logical that reference
librarians would be more likely to explore research impact in their
interactions with faculty and students. The idea that librarians newer to the
profession would have more knowledge of the newer metrics reflects Thomas
Kuhn’s (1962) suggestion that new ideas are generally promulgated by new
generations and not by older professionals. At the same time, technological
advances have made it possible to track the impact of research outputs in new
ways, as well as boosting the ease of traditional bibliometrics. But has the
availability of new metrics generated a paradigm shift or even a revolution in
tracking research impact?
Methods
This study targeted academic librarians in the state of Oklahoma.
Publicly available information on academic library websites for all 2-year and
4+ year colleges and universities in the state (N=38) was used to gather names
and email addresses of librarians, and all but three institutions had this
information posted. In this study, “librarian” was defined as a person whose
job title was “librarian,” and was not limited to people with specific
educational backgrounds (such as an MLIS). In total, 228 librarians, 38 at
2-year institutions and 190 at 4+ year institutions, were identified and in
July 2015 were emailed a survey solicitation with a link to the survey
instrument. Two follow-up emails were sent in August to generate more
responses. As an incentive for participation, the opportunity to win a $20
Amazon gift card was offered. The survey consisted of seven open-ended
questions and thirteen closed-ended questions including seven Likert-type, two
“check all that apply,” and four others. These questions were designed to
answer the research questions provided above. See appendix for the survey
instrument.
Limitations of the Study
Only librarians in Oklahoma were surveyed and these results may not be
reflective of librarians outside the state. This was a population survey, not a
sample survey, so inferential statistics (such as significance testing) were
not appropriate. Three-quarters of those surveyed did not respond and it is
unknown whether they differed on some characteristics from those who did
respond. Since survey responses were anonymous, a comparison of responders to
non-responders is not possible. Given these limitations, it is not appropriate to
generalize the study findings beyond the group of librarians who responded.
Additionally, the survey presented the term “altmetrics” without defining it.
Respondents may have had different definitions in mind when answering the
survey questions.
Results
Survey Response Rate
A total of 58 usable responses were received for an overall response
rate of 25.4%. By type of institution, ten librarians at 2-year institutions
responded, for a response rate of 26.3%. Responses were received from 48
librarians at 4+ year institutions resulting in a response rate of 25.3% for
that group.
Description of the Variables
There were two types of information used as dependent variables. One
type was respondents’ knowledge or opinions of various research impact measures
as indicated by their answers to survey questions. The second type of dependent
variables was scores on two items. One of these scores, Bibliometrics
Familiarity, was computed by adding up the number of bibliometrics items from
Question 3 with which respondents reported familiarity. Their score on this
item could range from zero (not familiar with any of the listed items) up to
five (familiar with all of them). The actual scores from these respondents
ranged from zero to four, with a mean score of 1.71. The other score item,
Altmetrics Familiarity, was computed by adding up the number of altmetrics
items from Question 2 with which respondents were familiar. Again, the
theoretical range was from zero to five, and for this item the actual range was
also zero to five. The mean on this score was .74. Additional dependent
variables included a number of open-ended questions that asked about faculty,
student, and librarian interest in learning about these measures and services,
outreach efforts, and current initiatives.
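As an illustration of how the two familiarity scales were scored, the sketch
below counts the items a respondent checked; the item lists mirror Questions 2
and 3 of the survey, while the respondent record and variable names are
hypothetical.

```python
# Each familiarity score is simply the count of listed items a respondent
# checked, giving the 0-5 range described above ("other" write-ins counted
# toward the altmetrics score, an assumption based on the reported range).

BIBLIOMETRICS_ITEMS = {"journal impact factor", "citation counts",
                       "h-index", "g-index", "i-10 index"}   # Question 3
ALTMETRICS_ITEMS = {"Altmetric", "Impactstory", "Mendeley",
                    "PlumX", "other"}                        # Question 2

def familiarity_score(checked, items):
    """Number of listed items this respondent marked as familiar."""
    return len(checked & items)

# Hypothetical respondent who checked three items in total:
checked = {"journal impact factor", "citation counts", "Mendeley"}
print(familiarity_score(checked, BIBLIOMETRICS_ITEMS))  # 2
print(familiarity_score(checked, ALTMETRICS_ITEMS))     # 1
```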
The study’s independent variables included type of institution: 2-year
(17.2%, 10) versus 4+ year (82.8%, 48); number of years as a librarian, from
five or fewer years (34.5%, 20) to six or more years (65.5%, 38); and primary
job responsibilities divided into reference and user services (60.3%, 35)
versus non-reference positions (39.7%, 23). The job responsibility
categorization was created from respondents’ answers to a closed-ended question
and an “other, please specify” section which allowed people to elaborate on
their job duties.
Data Analysis
The first research question was: “Are academic librarians more familiar
with established measures of research impact (bibliometrics) than with the
emerging field of altmetrics?” When asked how familiar they were “with the
concept of altmetrics to assess research impact” fewer than 10 percent (8.6%,
5) of respondents rated themselves as very familiar, and one-quarter indicated
that they were “not familiar at all” (25.9%, 15). The majority were in the
middle at “slightly familiar” (34.5%, 20) or “not very familiar” (31.0%, 18).
Respondents were then given a list of bibliometric methods and asked to
indicate the ones with which they were familiar (this was a “check all that
apply” question; see Question 3 in the Appendix). Most respondents knew some of
the listed methods, with only 20.7% (f=12) indicating that they did not know
any of the ones listed. The most common familiarity was with citation counts
(74.1%, 43) and JIF (65.5%, 38). Around one-quarter (24.1%, 14) were also
familiar with the H-Index, but almost no respondents knew about the i10-Index
(5.2%, 3) or the G-Index (1.7%, 1).
Respondents were less likely to have knowledge of altmetrics tools. When
given a list of tools (Question 2), two-thirds (63.8%, 37) were not familiar
with any of the ones listed. Around one-quarter had heard of Altmetric.com
(25.9%, 15) and Mendeley (24.1%, 14). Less well-known were Impactstory (13.8%,
8) and PlumX (8.6%, 5). With the “other, please specify” option, one respondent
wrote in PLOS. See Table 1.
Table 1
Respondents’ Familiarity with Bibliometrics Methods and Altmetrics Tools

|                                         | % and f (N=58) |
|-----------------------------------------|----------------|
| Familiarity with Bibliometrics Methods  |                |
| Citation Counts                         | 74.1% (f=43)   |
| Journal Impact Factor                   | 65.5% (f=38)   |
| H-Index                                 | 24.1% (f=14)   |
| i10-Index                               | 5.2% (f=3)     |
| G-Index                                 | 1.7% (f=1)     |
| None                                    | 20.7% (f=12)   |
| Familiarity with Altmetrics Tools       |                |
| Altmetric.com                           | 25.9% (f=15)   |
| Mendeley                                | 24.1% (f=14)   |
| Impactstory                             | 13.8% (f=8)    |
| PlumX                                   | 8.6% (f=5)     |
| Other (PLOS)                            | 1.7% (f=1)     |
| None                                    | 63.8% (f=37)   |
Table 2
Comparison of Means Across Types of Independent Variables

|                           | 6+ Years as Librarian (Mean, N) | 5 or Fewer Years as Librarian (Mean, N) |
|---------------------------|---------------------------------|-----------------------------------------|
| Bibliometrics Familiarity | 2.03 (N=38)                     | 1.10 (N=20)                             |
| Altmetrics Familiarity    | .89 (N=38)                      | .45 (N=20)                              |

|                           | 4+ Year School | 2-Year School* |
|---------------------------|----------------|----------------|
| Bibliometrics Familiarity | 1.81 (N=48)    | 1.20 (N=10)    |
| Altmetrics Familiarity    | .83 (N=48)     | .30 (N=10)     |

|                           | Reference & User Services | Non-Reference Librarians |
|---------------------------|---------------------------|--------------------------|
| Bibliometrics Familiarity | 1.69 (N=35)               | 1.74 (N=23)              |
| Altmetrics Familiarity    | .77 (N=35)                | .70 (N=23)               |

*N for this category is small. View these numbers with caution.
In order to compare knowledge across different independent variables,
the data were condensed into two scales, “Bibliometrics Familiarity” and
“Altmetrics Familiarity,” as described previously. A mean score was calculated
for each and indicated that, on average, respondents were familiar with nearly
two (1.71) bibliometric methods and almost one (.74)
altmetrics tool. Note that the large number of respondents with no knowledge of
these items pulls the mean score down: 20.7% (12) claimed no knowledge of
bibliometrics and 63.8% (37) had heard of none of these altmetrics. When those
respondents were set aside and means calculated for those who were familiar
with one or more items, mean scores showed knowledge of slightly more than two
(2.15) bibliometrics and slightly more than two (2.15) altmetrics. In the
following analysis the mean scores include the “none” answers.
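(As an arithmetic check on the bibliometrics figures: 1.71 × 58 ≈ 99 items
checked in total, and 99 ÷ (58 − 12) ≈ 2.15.)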
Mean scores for Bibliometrics Familiarity and Altmetrics Familiarity
were compared across independent variables. People with six or more years in
the profession were familiar with nearly twice as many bibliometrics (2.03 to
1.10) and altmetrics (.89 to .45) as those with five or fewer years of
experience. Those at 4+ year colleges and universities were more familiar with
bibliometrics (1.81 to 1.20) and altmetrics (.83 to .30) than those at 2-year
schools, but since the N for the 2-year schools is quite small, these results
should be viewed with caution. There was little difference of knowledge between
people who worked in reference and user services compared to those who held
other types of positions (1.69 to 1.74, and .77 to .70). See Table 2.
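For readers who wish to reproduce this kind of comparison, here is a minimal
pandas sketch of the group-mean calculation behind Table 2; the data frame and
its values are hypothetical stand-ins rather than the study’s data.

```python
import pandas as pd

# Hypothetical respondent-level records with the two familiarity scores
# and one of the study's independent variables (years in the profession).
df = pd.DataFrame({
    "years_group": ["6+", "5 or fewer", "6+", "5 or fewer"],
    "biblio_familiarity": [3, 1, 2, 1],
    "alt_familiarity": [1, 0, 2, 1],
})

# Mean familiarity score per group, as reported in Table 2.
print(df.groupby("years_group")[["biblio_familiarity", "alt_familiarity"]].mean())
```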
Research question 2 asked, “What are the attitudes of academic
librarians toward altmetrics versus traditional measures (bibliometrics)?” The
most widely used traditional metrics are JIF and citation counts. The H-Index,
G-Index, and i10-Index use calculations based on citation count and could be
considered less commonly used traditional metrics. The research question is
difficult to answer because few respondents had an opinion about the less
commonly used metrics. In fact, when asked about their opinions of particular
metrics most chose “I don’t know” as their answer for Hirsch’s H-Index (75.9%,
44), Google’s i10-Index (93.1%, 54), and Egghe’s G-Index (94.8%, 55). While
most respondents (67.2%, 39) also said they didn’t know about altmetrics as an
effective measure of individual research productivity, enough answered this
question for a comparison to JIF and citation counts. About two-thirds of
respondents (65.5%, 38) held the opinion that citation counts are effective for
assessing an individual investigator’s research impact. This was about twice as
many as those who thought journal impact factor (39.7%, 23) or altmetrics
(31.0%, 18) were effective. See Table 3.
Table 3
Opinions About Effectiveness of Various Measures of Research Impact (N=58)

|                 | Effective  | Ineffective | Don’t know |
|-----------------|------------|-------------|------------|
| Citation Counts | 65.5% (38) | 13.8% (8)   | 20.7% (12) |
| JIF             | 39.7% (23) | 19.0% (11)  | 41.4% (24) |
| Altmetrics      | 31.0% (18) | 1.7% (1)    | 67.2% (39) |
| H-Index         | 20.7% (12) | 3.4% (2)    | 75.9% (44) |
| i10-Index       | 6.9% (4)   | 0.0% (0)    | 93.1% (54) |
| G-Index         | 1.7% (1)   | 3.4% (2)    | 94.8% (55) |
Respondents were asked for their open-ended comments about areas for
improvement for research impact measures and altmetrics. Across the eleven
responses, respondents felt that traditional citation-based measures have
problems that altmetrics might be able to address. The problems listed
included that a highly cited author might publish “garbage that is later
withdrawn from publication” yet the article still receives a high citation
count, and that uncited literature has “an important role in the body of
research as a whole” that is not recognized through citation counts.
Traditional resources like Web of Science are difficult to use, while
altmetrics offer easier and more current ways of measuring impact. The chief
problem respondents recognized with altmetrics was the lack of standards for
what these metrics mean: a saved article does not mean it was “used in a
meaningful way.” These metrics should not be over-valued “because they can be
manipulated,” and they “should be used only with extreme caution.” Several
respondents also recognized a need for more awareness among faculty and
librarians and suggested that workshops would be ideal.
Research question 3 asked, “Are academic librarians being called upon by
faculty to provide information about new research impact measures, and if so,
what has characterized these interactions?” Very few librarians reported that
faculty or students requested information about altmetrics. In fact, 89.7% (52)
had zero such requests. A small number (8.6%, 5) received one to five
information requests, and one respondent (1.7%) reported six to ten requests.
This survey question was accompanied by an open-ended option. Examples given of
requests included citation analysis of a professor’s publications, mentions of a
faculty member in popular media, usage counts (full text, citations, data,
etc.), and help in determining the impact factor of an obscure journal.
The final research question was, “What are academic libraries doing, or
what could they be doing regarding altmetrics?” This was addressed with a
number of open-ended questions about a variety of types of outreach,
initiatives, and training.
Respondents were asked what types of outreach were currently being
offered at their institutions on altmetrics and traditional research impact
measures. Four librarians reported covering altmetrics in campus workshops,
sessions, and discussions at faculty orientation. Two had written LibGuides on
altmetrics. One library was in the process of marketing their altmetrics
information and another expected to start using altmetrics in a new Digital
Commons. Three didn’t know of any efforts on their campuses. Concerning
traditional metrics outreach, five librarians mentioned offering citation
counts and three the JIF. There were three comments about individual
consultations and five on workshops that their libraries offer. One mentioned
LibGuides, and two said they didn’t know. When asked what types of outreach they believed could
help faculty and students learn about research impact measures and altmetrics,
eleven respondents mentioned various sorts of trainings and opportunities
including classes, workshops, webinars and online videos, and conferences or
seminars. Two thought the best way would be to start demonstrating what is
available and what works. Two others said that faculty and students on their
campuses were not interested in learning new things. Respondents were mostly
not aware of initiatives underway at their institutions to capture altmetrics
data.
The librarians in the study were very interested in learning more about
altmetrics. In fact, 84.2% (48) said that they would attend if a free workshop
was offered at their institution. When asked what tools could help them learn
more about research impact measures and altmetrics, ten mentioned some type of
training, although several were careful to point out that it should come from
knowledgeable sources such as ACRL, Digital Commons, or vendors. Specific
training types mentioned included sessions, workshops, webinars and online
videos, guides, descriptions, and outlines. One respondent stated they could
learn on their own using Google, and two made general comments that they were
interested in learning more.
Discussion
This study has a few measures that can be compared to findings from
other studies, and it has a number of unique findings to report. First and most
importantly, the data revealed that there appears to be a dearth of knowledge
among academic librarians in Oklahoma about altmetrics tools, and most of the
librarians who responded to the survey were not familiar with newer forms of
bibliometrics (H-Index, etc.). Among respondents, librarians who had been in
the profession over five years were more familiar with both altmetrics and
bibliometrics. Citation counts and journal impact factor have been used for
many years, and it is unsurprising that librarians are likely to be familiar
with their advantages and limitations. However, other measures such as the more
recent bibliometric calculations (i10 Index, G-Index, and H-Index) and
altmetrics are newer and generally less well-known.
Konkiel et al. (2015) found that at universities with the highest
Carnegie classification level, 30% of academic librarians overall and 50% of
“scholarly communication support” librarians reported having “very expert
familiarity” with altmetrics, while in the current study, only 8.6% claimed
that they were “very familiar.” It is not surprising that librarians at
research-intensive universities would have more familiarity than librarians
from a mixture of college and university levels. The current study seems to
bear out findings in the literature that for many library professionals, more
education regarding bibliometrics and altmetrics is needed. While Konkiel et
al. found that a very specific job category of librarians had more familiarity
with altmetrics, the current study compared reference and user services
librarians with non-reference librarians and found no meaningful difference in
knowledge between job types.
Despite calls in the literature for librarians to be involved with
providing altmetrics for the scholars on their campuses, most respondents
reported that they had received no such requests from faculty. A small number
of the librarians in the study had taught about altmetrics in various workshops
and discussions, and some had written LibGuides or used other marketing. A
handful of librarians offered standard bibliometrics services individually or
in workshops. While the current study confirms the literature’s reflection of
librarians’ interest in bibliometrics and altmetrics, respondents’ experiences
in this study reveal little evidence of research support requests from other
members of their institutions. This is in keeping with preliminary findings
from Konkiel et al. (2015).
Most respondents wanted to learn more about altmetrics and expressed
interest in various types of training. Two international studies have also
reported that academic librarians in a variety of countries are interested in
altmetrics and bibliometrics training offered by their institutions or through
vendors, or would like LIS schools to add these topics to their curricula
(González-Fernández-Villavicencio et al., 2015; Kennan et al., 2014). While Roemer and Borchardt (2015b) advocate
that early career librarians should be providers of altmetrics in their jobs,
this study found that early career librarians were less likely to be familiar
with research metrics than their more experienced counterparts. This finding
did not support Kuhn’s (1962) observation that new discoveries are championed
by newer professionals rather than those already established in their careers.
Additionally, while altmetrics may have the potential for generating a paradigm
shift in the way research impact is measured in the LIS profession, this small
study suggests that it remains to be seen whether these newer metrics will
revolutionize LIS practice in this area in any widespread manner.
Conclusions
Technological advances are pushing the dissemination of research into
new venues and traditional bibliometrics are not capturing the impact of these
new practices. Altmetrics offer new ways to measure research impact. The
literature advocates for librarians to take up these cutting-edge technologies.
However, there is little hard data in the literature showing if and how
librarians are using altmetrics. Future studies that produce concrete evidence
would be valuable to the profession. One potential direction for future
research might include in-depth interviews with librarians who are using
altmetrics in their jobs to discover how they are using them and for what
purposes. Information-seeking behaviour studies of both scholars and librarians
who are searching for information on research impact could be valuable. It
might also be useful to explore the opinions of tenured faculty who make
decisions on tenure review boards as to whether or not they positively view
alternative metrics. There is much work to be done before the potential for and
use of altmetrics in academia is well-understood.
References
ACRL
Research Planning and Review Committee. (2014). Top trends in academic libraries:
A review of the trends and issues affecting
academic libraries in higher education. College & Research Libraries News, June
2014, 294–302. Retrieved from http://crln.acrl.org/
Adams, T.
M., & Bullard, K. A. (2014). A case study of librarian outreach to
scientists: Collaborative research and scholarly communication in conservation
biology. College & Undergraduate
Libraries, 21(3-4), 377–395. http://dx.doi.org/10.1080/10691316.2014.925415
Adie, E.,
& Roe, W. (2013). Altmetric: Enriching scholarly content with article-level
discussion and metrics. Learned
Publishing, 26(1), 11–17. http://dx.doi.org/10.1087/20130103
Åstrom, F.,
& Hansson, J. (2012). How implementation of bibliometric practice affects
the role of academic libraries. Journal
of Librarianship and Information Science, 45(4), 316–322. http://dx.doi.org/10.1177/0961000612456867
Bar-Ilan,
J., Sugimoto, C., Gunn, W., Haustein, S., Konkiel, S., Larivière, V., &
Lin, J. (2013, November). Altmetrics: Present and future – panel. Proceedings of the Association for
Information Science and Technology, 50(1). http://dx.doi.org/10.1002/meet.14505001013
Bazeley, J.
W., Waller, J., & Resnis, E. (2014). Engaging faculty in scholarly
communication change: A
learning community approach. Journal of
Librarianship and Scholarly Communication, 2(3), ep1129. http://dx.doi.org/10.7710/2162-3309.1129
Belter, C.
W. (2015). Bibliometric indicators: Opportunities and limits. Journal of the Medical Library Association,
103(4), 219–221. http://dx.doi.org/10.3163/1536-5050.103.4.014
Bladek, M.
(2014). Bibliometrics services and the academic library: Meeting the emerging
needs of the campus community. College
& Undergraduate Libraries, 21(3-4), 330–344. http://dx.doi.org/10.1080/10691316.2014.929066
Bollen, J.,
Van de Sompel, H., Smith, J. A., & Luce, R. (2005). Toward alternative
metrics of journal impact: A comparison of download and citation data. Information Processing & Management, 41(6),
1419–1440. http://dx.doi.org/10.1016/j.ipm.2005.03.024
Bonn, M.
(2014). Tooling up: Scholarly
communication education and training. College & Research Libraries News,
March 2014, 132–135. Retrieved from http://crln.acrl.org/
Bornmann,
L. (2014). Do altmetrics point to the broader impact of research? An overview
of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903. http://dx.doi.org/10.1016/j.joi.2014.09.005
Bornmann,
L. (2015). Usefulness of altmetrics for measuring the broader impact of
research: A case study using data from PLOS and F1000Prime. Aslib Journal of Information Management,
67(3), 305–319. http://dx.doi.org/10.1108/AJIM-09-2014-0115
Brigham, T.
J. (2014). An introduction to altmetrics. Medical
Reference Services Quarterly, 33(4), 438–447. http://dx.doi.org/10.1080/02763869.2014.957093
Brown, J.
D. (2014). Citation searching for tenure and promotion: An overview of issues
and tools. Reference Services Review, 42(1),
70–89. http://dx.doi.org/10.1108/RSR-05-2013-0023
Carpenter,
T. A. (2014). Comparing digital apples to digital apples: Background on NISO’s
effort to build an infrastructure for new forms of scholarly assessment. Information Services and Use, 34(1-2),
103–106. http://dx.doi.org/10.3233/ISU-140739
Cooper, I.
D. (2015). Bibliometrics basics. Journal
of the Medical Library Association, 103(4), 217–218. http://dx.doi.org/10.3163/1536-5050.103.4.013
Dinsmore,
A., Allen, L., & Dolby, K. (2014). Alternative perspectives on impact: The
potential of ALMs and altmetrics to inform funders about research impact. PLoS Biology, 12(11), e1002003. http://dx.doi.org/10.1371/journal.pbio.1002003
Galligan,
F., & Dyas-Correia, S. (2013). Altmetrics: Rethinking the way we measure. Serials Review, 39(1), 56–61. http://dx.doi.org/10.1016/j.serrev.2013.01.003
González-Fernández-Villavicencio,
N., Dominguez-Aroca, M., Calderón-Rehecho, A., & Garcia-Hernández, P.
(2015). What role do librarians play in altmetrics? Retrieved from http://eprints.rclis.org/25481/1/Altmetrics.pdf
Gumpenberger,
C., Wieland, M., & Gorraiz, J. (2012). Bibliometric practices and
activities at the University of Vienna. Library
Management, 33(3), 174–183. http://dx.doi.org/10.1108/01435121211217199
Gunn, W.
(2014). Mendeley: Enabling and understanding scientific collaboration. Information Services and Use, 34(1-2),
99–102. http://dx.doi.org/10.3233/isu-140738
Gutierrez,
F. R. S., Beall, J., & Forero, D. A. (2015). Spurious alternative impact
factors: The scale of the problem from an academic perspective. BioEssays, 37(5), 474–476. http://dx.doi.org/10.1002/bies.201500011
Herther, N.
K. (2013). NISO project brings scientific evaluation into the 21st century with
altmetrics. Information Today, Retrieved
from http://newsbreaks.infotoday.com/NewsBreaks/NISO-Project-Brings-Scientific-Evaluation-Into-the-21st-Century-With-Altmetrics-90409.asp
Hirsch, J.
E. (2005). An index to quantify an individual's scientific research output. Proceedings
of the National Academy of Sciences of the United States of America, 102(46),
16569-16572.
Howard, J. (2013). New metrics providers help keep libraries in the
research-tracking game. Chronicle of Higher Education, 59(38), A6-A7.
Information
Today Newsbytes. (2014). Information
Today, 31(7), 3.
Kennan, M.
A., Corrall, S., & Afzal, W. (2014). “Making space” in practice and
education: Research support services in academic libraries. Library Management, 35(8/9), 666–683. http://dx.doi.org/10.1108/LM-03-2014-0037
Konkiel,
S., Sutton, S., & Levine-Clark, M. (2015, November). Use of altmetrics in
U.S.-based academic libraries. Presentation at The Altmetrics Conference
(2:AM). Amsterdam, The Netherlands. Retrieved from http://altmetricsconference.com/?p=68
Kuhn, T. S. (1962). The structure of scientific revolutions. Chicago, IL:
University of Chicago Press.
Lapinski,
S., Piwowar, H., & Priem, J. (2013). Riding the crest of the altmetrics
wave: How librarians can help prepare faculty for the
next generation of research impact metrics. College & Research Libraries News, 74(6),
292–300. Retrieved from http://crln.acrl.org/content/74/6/292.full
MacColl, J.
(2010a). Library roles in university
research assessment. LIBER Quarterly, 20(2), 152–168. http://dx.doi.org/10.18352/lq.7984
MacColl, J.
(2010b). Research assessment and the role of the library. Report produced by
OCLC Research. Retrieved from http://www.oclc.org/research/publications/library/2010/2010-01.pdf
National
Information Standards Organization. (2014). NISO
Altmetrics Standards Project White Paper. Retrieved July 20, 2016, from http://www.niso.org/apps/group_public/document.php?document_id=13295
Neylon, C.,
& Wu, S. (2009). Article-level metrics and the evolution of scientific
impact. PLoS Biology, 7(11),
e1000242. http://dx.doi.org/10.1371/journal.pbio.1000242
Piwowar,
H., & Priem, J. (2013). The power of altmetrics on a CV. Bulletin of the American Society for
Information Science and Technology, 39(4), 10–13. http://dx.doi.org/10.1002/bult.2013.1720390405
Roemer, R.
C., & Borchardt, R. (2013). Institutional altmetrics and academic
libraries. Information Standards
Quarterly, 25(2), 14–19. http://dx.doi.org/10.3789/isqv25no2.2013.03
Roemer, R.
C., & Borchardt, R. (2015a). Altmetrics and the role of librarians. Library Technology Reports, 51(5), 31–37. http://dx.doi.org/10.5860/ltr.51n5
Roemer, R.
C., & Borchardt, R. (2015b). New grads, meet new metrics: Why early career
librarians should care about altmetrics & research impact. In the Library
with the Lead Pipe (blog). Retrieved April 18, 2016 from http://www.inthelibrarywiththeleadpipe.org/2015/new-grads-meet-new-metrics-why-early-career-librarians-should-care-about-altmetrics-research-impact/
Roemer, R.
C., & Borchardt, R. (Undated). Keeping up with … altmetrics. Retrieved from
http://www.ala.org/acrl/publications/keeping_up_with/altmetrics
Zhao, D.
(2011). Bibliometrics and LIS education: How do they fit together? Proceedings of the American Society for
Information Science and Technology, 48, 1–4. http://dx.doi.org/10.1002/meet.2011.14504801190
APPENDIX: Survey Questions
Q1: How familiar are you with the concept of altmetrics to assess
research impact?
1 = very familiar, 2 = slightly familiar, 3 = not very familiar, 4 = not
familiar at all
Q2: With which of the following altmetrics-related services are you
familiar? Please check all that apply.
q2a: Altmetric
q2b: Impactstory
q2c: Mendeley
q2d: PlumX
q2e: I have never
heard of any of these services
q2f: other, please
specify
Q3: With which of the following methods of assessing research impact are
you familiar? Please check all that apply.
q3a: journal impact
factor
q3b: citation
counts
q3c: h-index
q3d: g-index
q3e: i-10 index
q3f: none
q3g: other, please
specify
Q4: In your opinion, how effective is the journal impact factor in
assessing an individual investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q5: In your opinion, how effective are citation counts in assessing an
individual investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q6: In your opinion, how effective is the H-Index in assessing an individual
investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q7: In your opinion, how effective is the G-Index in assessing an
individual investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q8: In your opinion, how effective is the i-10 Index in assessing an
individual investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q9: During the past year, how many information requests regarding
altmetrics have you received from a faculty member or student at your
institution?
1 = 0, 2 = 1-5, 3 = 6-10, 4 = more than 10
Q10: If you have received an information request from a faculty member
or student at your institution regarding altmetrics in the past year, please
describe some of these interactions and the specific information requested. If
not applicable, please select n/a.
Q11: In your opinion, how effective are altmetrics in assessing an
individual investigator’s research impact?
1 = very ineffective
2 = ineffective
3 = somewhat ineffective
4 = I don’t know
5 = somewhat effective
6 = effective
7 = very effective
Q12: What, if any, types of outreach does your library offer regarding
altmetrics?
Q13: What, if any, types of outreach does your library offer regarding
traditional research impact measures? Examples could include the journal impact
factor, citation counts, the H-Index, the G-Index, or the i-10 Index.
Q14: If a free workshop was offered at your institution regarding
research impact assessment and altmetrics, would you attend?
1 = yes, 2 = no
Q15: Please describe any initiatives underway at your institution to
collect altmetrics data.
Q16: In your opinion, in what areas do research impact measures and
altmetrics need improvement?
Q17: What tools could help you learn more about research impact measures
and altmetrics?
Q18: What types of outreach do you believe could help faculty and
students learn more about research impact measures and altmetrics?
Q19: How many years have you served as a librarian?
1 = <1 year, 2 = 1-5 years, 3 = 6-10 years, 4 = more than 10 years
Q20: In what department do you primarily operate as a librarian?
1 = reference, 2 = ILL, 3 = serials, 4 = systems, 5 = special
collections, 6 = other, please specify