Evidence Summary
Information Literacy Skills Are Positively Correlated with Writing Grade
and Overall Course Performance
A Review of:
Shao, X., & Purpur, G. (2016). Effects of information literacy
skills on student writing and course performance. The Journal of Academic Librarianship, 42(6), 670-678. http://dx.doi.org/10.1016/j.acalib.2016.08.006
Reviewed by:
Rachel E. Scott
Integrated Library Systems Librarian
University Libraries
University of Memphis
Memphis, Tennessee, United States of America
Email: rescott3@memphis.edu
Received: 3 Mar. 2017 Accepted: 7 Apr. 2017
© 2017 Scott.
This is an Open Access article distributed under the terms of the Creative
Commons‐Attribution‐Noncommercial‐Share Alike License 4.0
International (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
Abstract
Objective – To measure the correlation of tested information
literacy skills with individual writing scores and overall course grade.
Design – Online, multiple-choice survey.
Setting – Public research university in North Carolina,
United States of America.
Subjects – Freshman students enrolled in either a first-year
seminar (UCO1200) or a basic English writing course (ENG1000).
Methods – A 25-question, forced-choice test was piloted with
30 students and assessed for internal consistency using Cronbach’s alpha. The
survey instrument was slightly revised before being administered online via
SelectSurvey to 398 students in 19 different sections of either UCO1200 or
ENG1000 during class sessions. The test measured students’ information
literacy skills in four areas: research strategies, resource types, scholarly
vs. popular sources, and evaluating websites. The preliminary questions asked for each
student’s name, major (by category), number of library instruction sessions attended,
and the names of library services utilized.
The students’
information literacy scores were compared to their writing scores and overall
course grades, both of which were obtained from course instructors. The
information literacy scores were also analyzed for correlation with the number of
library instruction sessions attended and the types of library services
utilized.
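As a point of reference for the internal-consistency check described in the Methods, Cronbach’s alpha for a k-item test is conventionally defined as

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right),

where \sigma^{2}_{Y_i} is the variance of scores on item i and \sigma^{2}_{X} is the variance of total test scores; values closer to 1 indicate stronger internal consistency. This is the textbook formulation, included here for context rather than drawn from the article.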
Main Results – Information literacy skills positively correlated
with writing scores (n=344, r=0.153, p=0.004) and final course grades (n=345,
r=0.112, p=0.037). Pearson’s correlation coefficients also demonstrated
relationships between writing scores and the information literacy test section
“Scholarly versus Popular Sources” (n=344, r=0.145, p=0.007), and between final grade
and the information literacy test sections “Types of Sources” (n=345, r=0.124,
p=0.021) and “Website Evaluation” (n=345, r=0.117, p=0.029). The impact of
using other library services or of attending multiple information literacy
sessions was not statistically significant.
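For readers interpreting the coefficients above, Pearson’s r for n paired observations (x_i, y_i), here information literacy scores paired with writing scores or final grades, takes the standard form

r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^{2}} \, \sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^{2}}},

where \bar{x} and \bar{y} are the sample means. The coefficient ranges from -1 to 1, with values near 0 indicating a weak linear relationship; this textbook definition is included for context rather than drawn from the article.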
Conclusion – Students’ mastery of tested information literacy
skills directly correlates with their writing scores and final course grades. The study
confirms the need for faculty and library collaboration to create
well-integrated library instruction and services, and advocates for librarians
to become integral to campus initiatives for student learning and success.
Commentary
There is a growing
body of literature linking academic libraries to various measures of student
success. Megan Oakleaf has published extensively on assessing the academic
library’s contributions; her 2016 article focuses on librarian involvement
in institutional learning analytics initiatives (Oakleaf, 2016). Rockman (2002)
shows that institutions have long collaborated across departments
and campuses to integrate information literacy into the general education
curriculum in support of institutional goals. By investigating the correlation of
information literacy skills with writing scores and overall class
grades, the authors of the study at hand provide a unique and compelling
contribution to these areas of the literature.
The “Reader’s Guide to the Literature on
Interventions Addressing the Need for Education and Training” facilitates the evaluation of a
study’s design, educational context, results, and relevance (Koufogiannakis,
Booth, & Brettle, 2006). The objective of the study was clearly
articulated. The first three research questions, which analyzed the correlation of
information literacy skills with both writing scores and course grade and identified
the key information literacy skills for each, were clearly stated and
investigated. The fourth research question, assessing the effect of library
use on student performance, is too broad and cannot be systematically
addressed. The relevant survey question, “Library services you have used in
order to complete your writing assignments (choose all that apply),” asks
students not to apply skills, as in the rest of the test questions, but to
understand or remember library jargon (e.g., “RAP session,” “online tutorials,”
“library service desks”) (p. 675).
The teaching method,
mode of delivery, instruction topics, and amount of instructional contact time
were not detailed. UCO1200 and ENG1000 sections were required to meet specific
assignment criteria to qualify for study participation. It was unclear if and
how these heightened requirements created a different educational intervention.
Name, academic class level, and major were the only demographic information
collected through the instrument, and it is unknown if the subjects were
representative of the university’s undergraduate population.
The study’s results
are clearly explained, but some details are missing. The authors did not
mention an IRB (Institutional Review Board) or a consent process. They also did
not account for the differences between the number of students completing the
information literacy skills test (n=398) and the number with available writing
scores (n=344) or final grades (n=345); presumably these students dropped the
course. The relevant data is presented and analyzed using SPSS statistical
software.
This article’s
positive contribution to the literature is the validation of its premise,
namely that information literacy skills can be learned through instruction and
use of library resources and services. The study makes a compelling argument
for the continued integration of tailored library instruction in the general
education curriculum; targeting first-year students can have a timely impact on
academic success.
References
Koufogiannakis, D., Booth, A., & Brettle, A. (2006). ReLIANT: Reader’s guide to the literature on interventions addressing the need for education and training. Library & Information Research, 30(94), 44-51. Retrieved from http://www.lirgjournal.org.uk/lir/ojs/index.php/lir/article/view/271/318
Oakleaf, M. (2016). Getting ready & getting started: Academic librarian involvement in institutional learning analytics initiatives. The Journal of Academic Librarianship, 42(4), 472-475. http://dx.doi.org/10.1016/j.acalib.2016.05.013
Rockman, I. F. (2002). Strengthening connections between information literacy, general education, and assessment efforts. Library Trends, 51(2), 185-198. Retrieved from http://hdl.handle.net/2142/8465