Evidence Summary
Multiple Sessions for Information Literacy Instruction Are Associated
with Improvement in Students’ Research Abilities and Confidence
A Review of:
Henry, J., Glauner, D., & Lefoe, G. (2015). A double shot of information literacy instruction at a community college. Community & Junior College Libraries, 21(1-2), 27-36. http://dx.doi.org/10.1080/02763915.2015.1120623
Reviewed by:
Kelley Wadson
Librarian
Bow Valley College
Calgary, Alberta, Canada
Email: kwadson@bowvalleycollege.ca
Received: 3 Mar. 2017  Accepted: 21 Apr. 2017
© 2017 Wadson.
This is an Open Access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0 International License (http://creativecommons.org/licenses/by-nc-sa/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly attributed, not used for commercial purposes, and, if transformed, the resulting work is redistributed under the same or similar license to this one.
Abstract
Objective – To
evaluate the impact of providing multiple information literacy (IL) sessions,
instead of a single “one-shot” session, to students in face-to-face and online
English courses.
Design –
Non-experimental, using pre-test and post-test surveys for one group, and only
a post-test survey for the other group.
Setting – A
small community college in North Carolina, United States of America.
Subjects – 352
students enrolled in 2 successive 3-credit English courses, excluding those
under the age of 18, for a total of 244 participants.
Methods –
The researchers selected two English courses, ENG 111 and ENG 112, at least one of which most students were required to take to earn a degree or certification. After consulting with faculty, the researchers designed two
workshops for each course that integrated active and group learning techniques.
The ENG 111 workshops covered pre-searching (e.g., mind mapping and selecting
search terms) and database searching in the first session, and website analysis
and research (e.g., URLs, Google’s advanced search, and the evaluative CRAAP
test) in the second session. The ENG 112 workshops covered subject database
searching in the first session and evaluative analysis of magazine and
scholarly journal articles in the second session. Instructors provided
web-based tutorials to online course sections as a substitute for the
face-to-face sessions. Course assignments were the same for both online and
face-to-face classes.
The researchers used anonymous online
surveys. ENG 111 students completed pre-test and post-test surveys for their
two workshops during the fall 2014 semester. The surveys consisted of seven
fill-in-the-blank and multiple-choice questions measuring pre-searching,
research, and website analysis skills, and three Likert-type 1-5 rating scale
questions measuring comfort levels. ENG 112 students completed their post-test
survey in the spring 2015 semester, which consisted of the same three 1-5
rating scale questions measuring comfort levels, to further test the
effectiveness of multiple sessions.
Main Results –
The ENG 111 pre-test survey had 244 (66.67% female and 33.33% male) respondents
and the post-test had 150 (72.37% female and 28.69% male) respondents. When
comparing results, scores increased for pre-searching, specifically
understanding of methods for brainstorming search terms (9%), and for all
measures of website analysis and research, namely understanding of library
databases (7.63%), choosing correct evaluative criteria (4.49%), recognizing
reliable top-level domains (TLDs) .edu (1.15%) and .gov (11.21%), and Google’s
advanced search (10.43%). Post-test scores decreased on the measures of understanding of a thesis statement (7%) and narrowing a topic when there is too much information (6%). For comfort levels, neutral responses did not vary much,
but there was a shift in responses from “not comfortable” to “somewhat
comfortable” and “very comfortable.” Across three measures, namely getting
started with a research paper, library research skills, and writing an academic
research paper, participants’ “not comfortable” responses decreased and their
“comfortable” responses increased. The ENG 112 post-test survey had 29 (60.71% female and 39.29% male) respondents and measured the same comfort levels; responses showed further improvement on all three questions. Analysis of both surveys showed slight gender variations. On
several pre-test and post-test measures, females scored lower than males in
understanding of databases, Google’s advanced search, and website analysis.
Conclusion –
The researchers conclude that expanding IL instruction from a single “one-shot”
to four sessions had a positive impact on student learning, particularly the
ability to evaluate websites and to use Google’s advanced search. Student
participants expressed increased comfort levels and confidence in their
research skills. To address the decreases observed on the post-test survey, the researchers planned to focus future IL sessions more on narrowing research topics and on using thesis statements alongside the research process. In
terms of instructional strategy, the researchers found timing the workshops
closely with the course assignments was helpful and concluded that the use of
hands-on, interactive elements was successful in engaging and assessing
students’ understanding in the workshops.
Commentary
This article adds to the substantial body of literature on IL instruction in academic libraries (Detmering, Johnson, Sproles, McClellan, & Linares, 2014). In particular,
the researchers cite considerable evidence supporting their application of
active learning and multiple instructional sessions as a substitute for “one-shot” instruction. Although not explicitly mentioned, the study also integrates elements of faculty-librarian collaboration and embedded librarianship, both of which have been shown to improve the effectiveness of IL instruction (Hamilton, 2012; Mounce, 2010).
This review used the ReLIANT tool for
evaluating research on educational and training interventions in library and
information science (LIS) (Koufogiannakis, Booth, & Brettle, 2006).
According to this checklist, there are flaws in the study design and results
that affect this article’s internal validity and relevance to LIS
practitioners.
The educational context and research
instruments are mostly well-explained and appended to the report, but there is
considerable ambiguity in the study’s purpose and design. The researchers state
the article examines the effectiveness of expanding IL instruction from one to
four sessions. However, this is not formulated clearly as a research question
or statement of purpose and, perhaps consequently, the article lacks discussion
of how and why the research design was selected. Additional limitations include
a lack of advanced statistical analysis, such as cross-tabulation and the
chi-square test of statistical significance, no differentiation in data
collection methods or analysis between face-to-face and online students, and
lack of pilot testing for the surveys. There is also limited description of the
population; community colleges are generally recognized to be quite diverse in
terms of age, ethnicity, and academic level of students. Except for gender and age, the researchers do not specify what demographic questions were included in the surveys, nor do they describe the institution’s demographic make-up, which
could aid LIS practitioners in assessing the study’s relevance and
applicability.
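As an illustration of the missing analysis noted above, the following Python sketch shows how a chi-square test could be applied to pre-test and post-test responses on a single survey question. It is a minimal sketch only, assuming scipy is available; the contingency counts are hypothetical placeholders, not data from the study.

    # Minimal sketch of a chi-square test of independence on hypothetical
    # pre/post survey counts; not an analysis from the reviewed study.
    from scipy.stats import chi2_contingency

    # Rows: survey administration (pre-test, post-test).
    # Columns: correct vs. incorrect answers on one skill question.
    # All counts are invented for illustration.
    contingency = [
        [130, 114],  # pre-test:  130 correct, 114 incorrect (n = 244)
        [95, 55],    # post-test:  95 correct,  55 incorrect (n = 150)
    ]

    chi2, p, dof, expected = chi2_contingency(contingency)
    print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")

    # A p-value below a chosen alpha (commonly 0.05) would indicate that
    # the pre/post difference in correct-answer rates is unlikely to be
    # due to chance alone.

Applied to each survey measure, such a test would allow researchers to report whether observed score changes are statistically significant, rather than relying on raw percentage comparisons.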
The one-shot session is a mainstay of information literacy instruction and, as the authors note, a well-recognized limitation is its attempt to cover too much content. Librarians have been testing various
strategies to address this, such as more effective assessment, e-learning, and
even full-credit courses (Mery, Newby, & Peng, 2012). A key strength of this
article is its description of a strategy that is manageable in scope;
developing e-learning and full-credit courses is not an option for many
libraries, particularly smaller institutions like community colleges.
Based on its design, this article shows an
associative rather than a causal relationship. LIS practitioners may find it
helpful as a pre-experimental or case study providing descriptive insights into
faculty-librarian collaboration, active learning techniques, and the potential
for multiple sessions to lower library anxiety and bolster students’ confidence
in their research skills.
References
Detmering, R., Johnson, A. M., Sproles, C., McClellan, S., & Linares, R. H. (2014). Library instruction and information literacy 2013. Reference Services Review, 42(4), 603-715. http://dx.doi.org/10.1108/RSR-07-2014-0028
Hamilton, B. J. (2012). Embedded librarianship: Tools and practices. Chicago, IL: ALA TechSource.
Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader’s guide to the literature on interventions addressing the need for education and training. Library & Information Research, 30(94), 44-51. Retrieved from http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/index
Mery, Y., Newby, J., & Peng, K. (2012). Why one-shot information literacy sessions are not the future of instruction: A case for online credit courses. College & Research Libraries, 73(4), 366-377. Retrieved from http://crl.acrl.org/index.php/crl/issue/archive
Mounce, M. (2010). Working together: Academic librarians and faculty collaborating to improve students' information literacy skills: A literature review 2000–2009. The Reference Librarian, 51(4), 300-320. http://dx.doi.org/10.1080/02763877.2010.501420