Editorial
Waiting for Evidence
Alison Brettle
Editor-in-Chief
Senior Lecturer, University of Salford
Email: a.brettle@salford.ac.uk
© 2013 Brettle. This
is an Open Access article distributed under the terms of the Creative Commons‐Attribution‐Noncommercial‐Share Alike License 2.5 Canada (http://creativecommons.org/licenses/by-nc-sa/2.5/ca/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
As library practitioners we may often want evidence to
help us make decisions or to provide a rationale for what we do. We may then get frustrated that the research
evidence doesn’t exist or doesn’t quite match our needs. This occurred to me
when reading recent articles about searching in relation to systematic reviews
(e.g., Gehanno et al., 2013; Nourbakhsh et al., 2012), which in turn reminded me of one of my first forays
into evidence based librarianship. I was
working on a project investigating the feasibility of undertaking systematic
reviews in social care (Long et al., 2002a; 2002b) and one of my roles was to
identify a set of “best” databases for searching in this field (Brettle & Long, 2001).
I soon realised not only that this was a difficult task, but that one of
the differences between evidence based practice in social care and evidence
based practice in medicine (whose practice we were emulating) is that the
questions, the answers, and the evidence needed to obtain those answers are all messier or fuzzier than a clinical question that can be broken down using PICO (Richardson et al., 1995). The overall conclusion of the social care project
was that, despite this messiness and fuzziness, it was still possible to adopt
a systematic approach and to identify “best” evidence (Long et al., 2002a) and
thus to undertake evidence based social care.
The same can be said about evidence based library and
information practice; the interventions that we are involved in and the
decisions we make often don’t involve cause and effect, and because of this
there won’t be a clear-cut answer. This doesn’t mean there is no evidence or
that we can’t be evidence based; it’s just that the medical hierarchy of
evidence (Guyatt et al., 1995) doesn’t fit, a point
also made by Crumley and Koufogiannakis
(2002). As in social care, we need to ensure that our view of evidence is a
broad one but as my recent reading on systematic reviews suggests, we also need
to be patient in waiting for answers. We need to think about building up a
picture of evidence for our practice rather than hoping (or expecting) that one
piece of research will provide the answers we need. My research into databases,
mentioned earlier, seemed to throw up more research questions than it answered.
I was able to provide an answer for a very specific topic but this couldn’t be
generalised for all topics or all databases, so it didn’t help me a great deal
in further searches or in teaching information literacy. My recent reading
provides additional pieces of evidence about when certain resources are more
appropriate than others, as well as generating a number of questions regarding
methodology. This incomplete picture of evidence is good news – it gives
practitioner researchers working in library and information practice plenty of
questions to investigate, which will ultimately generate a better overall picture
of evidence.
This March issue of EBLIP contains a wide variety of
research articles, evidence summaries, reviews and commentaries. I hope it
helps you build up a useful picture of the evidence you need for your practice.
References
Brettle, A. J., & Long, A. F. (2001). Comparison of bibliographic databases
for information on the rehabilitation of people with severe mental illness. Bulletin of the Medical Library Association,
89(4), 353-362.
Crumley, E., & Koufogiannakis, D. (2002). Developing evidence-based librarianship: Practical steps for implementation. Health Information and Libraries Journal, 19(2), 61-70.
doi:10.1046/j.1471-1842.2002.00372.x
Gehanno, J.-F., Rollin, L., & Darmoni, S. (2013). Is the coverage of Google Scholar enough to be used alone for systematic reviews? BMC Medical Informatics and Decision Making, 13(7). doi:10.1186/1472-6947-13-7
Guyatt, G. H., Sackett, D. L., Sinclair, J. C., Hayward, R., Cook, D. J., Cook, R. J., Bass, E., Gerstein, H., Haynes, B., Holbrook, A., et al. (1995).
Users' guides to the medical literature: IX. A method for grading health care
recommendations. JAMA, 274(22),
1800-1804. doi:10.1001/jama.1995.03530220066035
Long, A. F., Godfrey, M., Randall, T., Brettle, A., & Grant, M. J. (2002a). Developing evidence based social care policy and practice. Part 3: Feasibility of undertaking systematic reviews in social care. Leeds: University of Leeds, Nuffield Institute for Health; University of Salford, Health Care Practice R&D Unit.
Long, A. F., Godfrey, M., Randall, T., Brettle, A., & Wistow, G. (2002b). Developing evidence based social care policy and practice. Part 1: Effectiveness and outcomes of rehabilitation for people with severe and enduring mental illness. Leeds: University of Leeds, Nuffield Institute for Health; University of Salford, Health Care Practice R&D Unit.
Nourbakhsh, E., Nugent, R., Wang, H., Cevik, C., & Nugent, K. (2012). Medical literature searches: A comparison of PubMed and Google Scholar. Health Information and Libraries Journal, 29(3), 214-222. doi:10.1111/j.1471-1842.2012.00992.x
Richardson, W. S., Wilson, M. C., Nishikawa, J., & Hayward, R. S. (1995). The well-built clinical question: A key to evidence-based decisions [Editorial]. ACP Journal Club, 123(2), A12-A13.