Commentary

The Evolution of Evidence Based Library and Information Practice, Part I: Defining EBLIP

Jonathan D. Eldredge
Associate Professor
Health Sciences Library and Informatics Center
University of New Mexico
Albuquerque, New Mexico, United States
Email: jeldredge@salud.unm.edu

Received: 1 Dec. 2012    Accepted: 4 Dec. 2012
© 2012 Eldredge. This is an Open Access article distributed under the terms of the Creative Commons‐Attribution‐Noncommercial‐Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Evidence Based Library and Information Practice (EBLIP) has achieved an impressive array of accomplishments during its brief lifespan. Mysteriously, the recent 15th anniversary of EBLIP passed with little notice. Had the past Editor not brought it to my attention, I might not have noticed this anniversary myself, despite having had a direct role in its development (Eldredge, 1997). EBLIP already has produced six international conferences, the establishment of this open access peer reviewed journal, continuing education courses based in the UK and US with broad international participation, representation in most types of libraries (academic, public, etc.), two special issues of peer reviewed journals, and two books devoted entirely to EBLIP (Booth & Brice, 2004; Connor, 2007). Some of the most robust early EBLIP work originated in countries such as Australia, Canada, Sweden, the UK, and the US. More recent EBLIP work has emerged from countries such as Iran and Japan (Gavgani, 2009; Yukiko, 2008). Indeed, few phenomena in the Library and Information Science (LIS) world can claim as many achievements within only 15 years.
EBLIP provides a sequential, structured process for integrating the best available evidence into important decisions. The practitioner applies this decision-making process by using the best available evidence, informed by a pragmatic perspective developed from working in the field, by critical thinking skills, and by an awareness of different research designs, all further modulated by knowledge of the affected user population’s values and preferences.
EBLIP has evolved quickly during the past 15 years. It has sustained this rapid pace thanks to a professional environment characterized by encouragement, inquiry, skepticism, dialogue, an openness among participants to new information, and a willingness on the part of LIS professionals to change their own minds. The brief definition above retains my original ideas while reflecting the further evolution of my thinking during the past 15 years (Eldredge, 2000; Eldredge, 2006; Eldredge, 2008), as well as incorporating elements from others’ definitions within this dynamic EBLIP professional environment (Booth, 2002; Crumley & Koufogiannakis, 2002).
The EBLIP Process
The EBLIP process provides structure for reaching important decisions. It resembles evidence based processes in other professions such as education, health, management, or public policy analysis. The steps in the EBLIP process can be summarized as:
1. Formulate an answerable question on an important issue
2. Search for the best available evidence to answer the question
3. Critically appraise the evidence
4. Make a decision and apply it
5. Evaluate one’s performance
In the pages that follow I will use the structure and sequence of the EBLIP process to define and describe the five EBLIP steps more clearly.
1. Formulate an Answerable Question
Davies (2011) has noted that
“Questions are the driving force behind evidence based practice (EBP).” Early
explorations with EBLIP questions focused upon formulation techniques and
compiling lists of questions from colleagues around the world (Booth, 2001; Eldredge, 2001). Lewis and Cotter (2007) noted the relative stability of EBLIP questions by subject matter between 2001 and 2006, although their study pointed to persistent research-practice gaps. Booth (2006) crystallized existing ideas about question formulation and provided perhaps the most pragmatic advice to date on how to formulate productive EBLIP questions. Wilson (2009) more recently provided a brief column with key pointers on question formulation.
A new development in 2008 marked an unexpected and significant turning point in EBLIP question formulation. Completely separate teams in Sweden and the US, each unaware of the other, simultaneously conducted consensus building Delphi studies to prioritize large numbers of EBLIP questions. Interestingly, and reflecting similar synchronicity, Rossall, Boyes, Montacute, and Doherty (2008) called that same year for similar approaches via research networks, in an effort unrelated to either the Swedish or US projects.
The Swedish team led by Maceviciute and Wilson (2009) conducted a two-round Delphi study that surveyed librarians from different types of libraries (academic, public, etc.) via email about their top research concerns. The final phase of the project involved a face-to-face workshop that used nominal group exercises to reach consensus.
The Delphi study in the US during 2008 focused only upon the leaders and research-oriented members of the Medical Library Association (MLA). MLA leaders and the members of MLA’s Research Section were queried through a two-phase series of surveys and voting on top ranked questions. This study produced 12 top priority research questions that became the MLA Research Agenda. A slightly modified 2011 Delphi study by the same team produced a list of top ranked research questions quite different from that of the 2008 study (Eldredge, Harris, & Ascher, 2009; Eldredge, Ascher, Holmes, & Harris, 2012). Harris, Holmes, Ascher, and Eldredge (in press) conducted a subject analysis of the complete list of 140 questions submitted during the first phase of the 2011 Delphi study. The short lists of high priority research questions generated by both the Swedish and US Delphi studies allow the profession to target high priority research concerns with money and other resources. Library administrators can use these short-listed priority research questions to encourage their librarians to pursue studies in these important research areas. Our profession will benefit most from devoting the greatest resources and incentives to answering the highest ranked EBLIP questions, although we should continue to encourage individual researchers to pursue alternative areas of applied research that could improve our practice.
2. Searching for the Evidence
Members of our profession are widely viewed by others as masters of organizing and searching for needed information. Paradoxically, our own databases are poorly organized and our vast grey literature remains unsuitable for easy systematic inquiry. Searching for relevant evidence thus poses considerable challenges for EBLIP practitioners. Winning (2004) and Beverley (2004) assessed the challenges and offered practical solutions for finding the needed evidence. While the technical details of these approaches might need updating, the principles these authors offered for EBLIP searching still largely work well. Booth (2008) wrote a complementary column on search tactics that might further aid EBLIP practitioners. Bradley (2007) has noted that we also often need to search other, non-LIS literatures to find potentially pertinent evidence. Many LIS researchers receive incentives in the form of paid conference attendance when presenting papers or posters, but zero or even negative incentives to publish the same research results; consequently, our profession deposits much of its intellectual capital in the extensive and not always easily accessed grey literature.
The “Evidence Summaries” in this journal provide great assistance to busy practitioners who can neither search for the evidence nor appraise it critically themselves. The Evidence Summaries began as a brilliant idea (Koufogiannakis, 2006) and, like most of EBLIP, have used the best available evidence to foster improvement as they have evolved continuously (Kloda, Koufogiannakis, & Mallan, 2011; Kloda, 2012). The open access journal Hypothesis supplements the Evidence Summaries in EBLIP through its regularly published literature reviews of recent research and its publication of expanded structured abstracts for the Research Awards granted annually by the Medical Library Association.
3. Critically Appraise the Evidence
Critical appraisal involves sifting
through the best available evidence in order to make a sound decision. Two core
principles guide the critical appraisal process. First, the evidence must be
appropriate for answering the specific EBLIP question. Second, evidence can
vary widely in its quality.
Making judgments on the appropriateness of evidence can be a challenge. Ideally, every graduate of a library or information practice professional school would be equipped to meet this challenge with at least one semester-length course on the strengths and weaknesses of the major qualitative and quantitative research methods. The absence of such coursework among the majority of graduates has led to a new role for librarians with such research methods training: “Translator.” Previously, I conceptualized the librarian roles of evidence “Producers” and “Consumers” (Eldredge, 2008, p. 254), but the realities of EBLIP have led to a new, third “Translator” role. The aforementioned Evidence Summaries in this journal enlist the services of such translators to critically appraise research evidence, which includes gauging its appropriateness.
The profession now has a far more robust evidence base than it did during the early years of EBLIP (Dalrymple, 2010). Systematic reviews are generally considered to be the highest form of evidence, regardless of EBLIP question type (Eldredge, 2008). Systematic reviews were once scarce in our profession; they are now far more common, with 39 documented in a wiki created by Denise Koufogiannakis (2012). We have seen a proliferation of rigorous quantitative and qualitative research studies in our literature (Given, 2006), although much more research needs to be pursued (Koufogiannakis & Crumley, 2006; Rossall et al., 2008). As Koufogiannakis (2011) recently concluded, “The scientific aspect of our work continually needs to be reinforced and built upon” (p. 2). The same principle applies to developing our local sources of evidence.
Many times Evidence Summaries do not quite address an emerging EBLIP question, so librarians must consult the research literature themselves. Fortunately, a number of critical appraisal checklists exist to guide their reviews of the research literature (Booth & Brice, 2003; Glynn, 2006). Evidence hierarchies are helpful tools to guide critical appraisal, but they should not be applied rigidly (Eldredge, 2002; Eldredge, 2008). Brettle (2012) recently reminded us that we also need to be open to the possible utility of evidence that defies our present EBLIP categorizations.
4. Make a Decision and Apply It
EBLIP questions emerge within local contexts when the practitioner must make an important decision. Some questions are more universally shared, as already noted in the discussion of the Delphi studies above. Making decisions and applying them likewise occurs within the specific local context. To succeed at this fourth step, EBLIP practitioners must know their local users’ values and preferences. Cognitive biases also present some of the most daunting challenges in making and applying a sound decision, regardless of local context. Cognitive biases interfere either with our perception of situations or with our decision making (Eldredge, 2007). The decision making step in EBLIP includes so many potential pitfalls that it would require an entire commentary to begin to examine even the most fundamental issues. Suffice it to say that some of the best minds in EBLIP have been grappling with these issues for years, and much more work remains.
5. Evaluate One’s Performance
Grant’s (2007) systematic review on reflective practice illustrates that, since 1978, our profession has had a scattered yet evolving history of incorporating self-evaluation into practice. She tracks a trend toward more sophisticated forms of reflection than the early forms, which consisted mainly of senior librarians’ reminiscences. Others have noted that evaluating performance takes place at the individual, institutional, and professional association levels, so it manifests itself in more than one form.
Conclusion and Next Steps
Anniversaries offer us a convenient chance to track progress, reassess, and reflect. EBLIP did not actually begin in 1997; that year marked only an articulation and initial definition that expanded into an impressive body of further work. Much work preceded the beginning of EBLIP, too, although its origins are diffuse when viewed through the retrospective lens of much undocumented history (Eldredge, 2004; Russell, 2008).
This commentary has described what, for many EBLIP readers, will be obvious. EBLIP has arrived on the LIS scene, has been fairly well codified within an environment of skepticism and reflection over the past 15 years, and certainly seems to be on the minds of many librarians and other information professionals these days.
The New Oxford American Dictionary defines a “definition” as “An exact statement or description of the nature, scope, or meaning of something” (Definition, 2010). This commentary addresses the nature and scope of EBLIP. Aside from mentioning its role in decision making, however, this piece does not delve into the purpose(s) or meaning of EBLIP. Part II (in the March 2013 issue) will grapple with the function(s) that EBLIP serves within our profession. Is EBLIP a social movement within our profession? A reformist movement? A new academic discipline? A paradigm shift? A diffusing innovation? The tentative answers to this functionalist or structural-functionalist question in Part II might shed light on where we need to be heading over the next 15 years and how we might best get there.
References
Beverley, C. (2004). Searching the library and information science literature. In A. Booth & A. Brice (Eds.), Evidence based practice for information professionals: A handbook (pp. 89-103). London: Facet. Retrieved 6 Dec. 2012 from http://ebliptext.pbworks.com/f/Booth+%26+Brice+2004+EBP+for+Info+Professionals+-+A+Handbook.pdf
Booth, A. (2001). Turning research
priorities into answerable questions. Health Information and
Libraries Journal, 18(2), 130-132. doi:10.1046/j.1471-1842.2001.d01-3.x
Booth, A. (2002). From EBM to EBL: Two steps forward or
one step back? Medical Reference Services Quarterly, 21(3), 51-64. doi:10.1300/J115v21n03_04
Booth, A. (2006). Clear and present questions: Formulating questions for evidence based practice. Library Hi Tech, 24(3), 355-368. doi:10.1108/07378830610692127
Booth, A. (2008). Unpacking your literature search toolbox: On search styles and tactics. Health Information and Libraries Journal, 25(4), 313-317. doi:10.1111/j.1471-1842.2008.00825.x
Booth, A., & Brice, A. (2003). Clear-cut?:
Facilitating health librarians to use information research in practice. Health
Information and Libraries Journal, 20(Suppl. 1), 45-52.
doi:10.1046/j.1365-2532.20.s1.10.x
Booth, A., & Brice, A. (Eds.)
(2004). Evidence based practice for
information professionals: A handbook. (pp.89-103). London:
Facet. Retrieved 6 Dec. 2012 from http://ebliptext.pbworks.com/f/Booth+%26+Brice+2004+EBP+for+Info+Professionals+-+A+Handbook.pdf
Bradley, C. (2007). The E in EBL: Finding the evidence to
support your practice. Feliciter, 53(1),
22-24.
Brettle, A. (2012). Learning
from others about research evidence. Evidence Based Library and
Information Practice, 7(2), 1-3.
Connor, E. (Ed.) (2007). Evidence-based librarianship: Case studies and active learning exercises. Oxford: Chandos.
Crumley, E., & Koufogiannakis,
D. (2002). Developing evidence-based librarianship: Practical steps for
implementation. Health Information and Libraries Journal, 19(2),
61-70. doi:10.1046/j.1471-1842.2002.00372.x
Dalrymple, P. W. (2010). Applying
evidence in practice: What we can learn from healthcare. Evidence
Based Library and Information Practice, 5(1), 43-47.
Davies, K. S. (2011). Formulating the evidence based practice
question: A review of the frameworks. Evidence Based Library and Information
Practice, 6(2), 75-80.
Definition. (2010). In A.
Stevenson & C. A. Lindberg (Eds.), New Oxford American Dictionary.
(3rd ed., p. 455). New York, NY: Oxford University Press.
Eldredge, J. (1997). Evidence-based
librarianship: A commentary for Hypothesis. Hypothesis, 11(3), 4-7.
Retrieved 6 Dec. 2012 from http://research.mlanet.org/hypothesis/hypo11-3.pdf
Eldredge, J. (2001). The
most relevant and answerable research questions facing the practice of health
sciences librarianship. Hypothesis, 15(1), 9-16. Retrieved 6 Dec.
2012 from http://research.mlanet.org/hypothesis/Hypo2001v.15%20no.1.pdf
Eldredge, J. (2002). Evidence-based librarianship: Levels of evidence. Hypothesis, 16(3), 10-13. Retrieved 6 Dec. 2012 from http://research.mlanet.org/hypothesis/hyp_v16n3.pdf
Eldredge, J. (2004). Evidence-based information practice: A prehistory. In A. Booth & A. Brice (Eds.), Evidence based practice for information professionals: A handbook (pp. 24-35). London: Facet. Retrieved 6 Dec. 2012 from http://ebliptext.pbworks.com/f/Booth+%26+Brice+2004+EBP+for+Info+Professionals+-+A+Handbook.pdf
Eldredge, J. (2006). Evidence-based
librarianship: The EBL process. Library Hi Tech, 24(3), 341-354. doi:10.1108/07378830610692118
Eldredge, J. (2007). Cognitive biases as
obstacles to effective decision making. Paper presented at the meeting of
EBLIP4: Transforming the Profession: 4th International Evidence Based Library
and Information Practice Conference, University of North Carolina-Chapel Hill,
Durham, NC, USA. Retrieved 6 Dec. 2012 from http://www.eblip4.unc.edu/downloads/eblip4_final_revised_postconference.pdf
Eldredge, J. D. (2000). Evidence-based librarianship: An overview. Bulletin of the Medical Library Association, 88(4), 289-302. Retrieved 6 Dec. 2012 from http://www.ncbi.nlm.nih.gov/pmc/articles/PMC35250/pdf/i0025-7338-088-04-0289.pdf
Eldredge, J. D. (2008). Evidence-based practice. In M. S. Wood (Ed.), Introduction to health sciences librarianship (pp. 241-269). Binghamton, NY: Haworth.
Eldredge, J. D., Ascher,
M. T., Holmes, H. N., & Harris, M. R. (2012). The new Medical Library Association
research agenda: Final results from a three-phase Delphi study. Journal of
the Medical Library Association, 100(3), 214-218. doi:10.3163/1536-5050.100.3.012
Eldredge, J. D., Harris, M. R., & Ascher, M. T. (2009). Defining the Medical Library
Association research agenda: Methodology and final results from a consensus
process. Journal of the Medical Library Association, 97(3),
178-185. doi:10.3163/1536-5050.97.3.006
Gavgani, V. Z. (2009). The
perception and practice of evidence based library and information practice
among Iranian medical librarians. Evidence Based Library and
Information Practice, 4(4), 37-57.
Given, L. (2006). Qualitative research in
evidence-based practice: A valuable partnership. Library Hi Tech, 24(3),
376-386. doi:10.1108/07378830610692145
Glynn, L. (2006). A critical appraisal
tool for library and information research. Library Hi Tech, 24(3),
387-399. doi:10.1108/07378830610692154
Grant, M. J. (2007). The role of reflection in the library
and information sector: A systematic review. Health Information and
Libraries Journal, 24(3), 155-166. doi:10.1111/j.1471-1842.2007.00731.x
Harris, M. R., Holmes, H. N., Ascher,
M. T., & Eldredge, J. D. (in press). Inventory of research questions identified by the 2011 MLA Research
Agenda Delphi study. Hypothesis.
Kloda, L. (2012). Improvements to evidence summaries: An
evidence based approach. Evidence Based Library and Information Practice, 7(3),
71-72.
Kloda, L. A., Koufogiannakis,
D., & Mallan, K. (2011). Transferring evidence into practice:
What evidence summaries of library and information studies research tell practitioners. Information Research, 16(1),
Paper 465. Retrieved 6 Dec. 2012 from http://informationr.net/ir/16-1/paper465.html
Koufogiannakis, D. (2006). Small steps forward
through critical appraisal. Evidence Based Library and Information Practice,
1(1), 81-82.
Koufogiannakis, D. (2011). Should we value
knowledge and expertise? Evidence Based Library and Information Practice, 6(3),
1-2.
Koufogiannakis, D. (2012). LIS
systematic reviews. Retrieved 6 Dec. 2012 from http://lis-systematic-reviews.wikispaces.com/Welcome
Koufogiannakis, D., & Crumley,
E. (2006). Research
in librarianship: Issues to consider. Library Hi Tech, 24(3): 324-340. doi:10.1108/07378830610692109
Lewis, S., & Cotter, L. (2007). Have the most relevant and
answerable research questions facing librarians changed between 2001 and 2006? Evidence
Based Library and Information Practice, 2(1), 107-120.
Maceviciute, E., & Wilson, T. D. (2009). A Delphi investigation into the
research needs in Swedish librarianship. Information Research, 14(4),
Paper 419. Retrieved 6 Dec. 2012 from http://informationr.net/ir/14-4/paper419.html
Rossall, H., Boyes,
C., Montacute, K., & Doherty, P. (2008). Developing
research capacity in health librarians: A review of the evidence. Health
Information and Libraries Journal, 25(3), 159-174.
doi:10.1111/j.1471-1842.2008.00788.x
Russell, K. (2008). Evidence-based
practice and organizational development in libraries. Library Trends,
56(4), 910-930.
Wilson, V. (2009). Matching question types to study
designs. Evidence Based Library and Information Practice, 4(1), 51-52.
Winning, A. (2004). Identifying sources of evidence. In A. Booth & A. Brice (Eds.), Evidence based practice for information professionals: A handbook (pp. 71-88). London: Facet. Retrieved 6 Dec. 2012 from http://ebliptext.pbworks.com/f/Booth+%26+Brice+2004+EBP+for+Info+Professionals+-+A+Handbook.pdf
Yukiko, S. (2008). From EBM to EBL / EBLIP. Pt. 2: Evidence-based practice for their own practice by medical librarians. Journal of Information Processing and Management, 51(2), 105-115. doi:10.1241/johokanri.51.105