Research Article
Brian Detlor
Professor
McMaster University
Hamilton, Ontario, Canada
Email: detlorb@mcmaster.ca
Alexander Serenko
Professor
Ontario Tech University
Oshawa, Ontario, Canada
Email: a.serenko@ontariotechu.ca
Tara La Rose
Professor
McMaster University
Hamilton, Ontario, Canada
Email: larost1@mcmaster.ca
Heidi Julien
Professor
University at Buffalo
Buffalo, New York, United States of America
Email: heidijul@buffalo.edu
Received: 30 Apr. 2024 Accepted: 2 July 2024
© 2024 Detlor, Serenko, La Rose, and Julien. This is an Open Access article
distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 4.0
International License (http://creativecommons.org/licenses/by-nc-sa/4.0/),
which permits unrestricted use, distribution, and reproduction in any medium,
provided the original work is properly attributed, not used for commercial
purposes, and, if transformed, the resulting work is redistributed under the
same or similar license to this one.
DOI: 10.18438/eblip30533
Objective – This paper presents the results from a survey of administrators and
instructors at public libraries across Canada investigating the delivery of
digital literacy training led by public libraries. The goal of the survey was
to capture a snapshot of the Canadian public library-led digital literacy
training landscape and to explore differences in perceptions of training
activities between public library administrators and instructors.
Methods – An online survey was distributed to administrators and
instructors at public libraries across Canada with the help of two national
public library associations. The survey instrument was developed based on a
theoretical framework from the research team’s prior case study investigations
of community-led digital literacy training. The survey included closed- and
open-ended questions concerning the availability of adequate/sustained funding,
the adequacy of dedicated classroom resources, the competency of teaching
staff, the helpfulness of support staff, the amount and frequency of knowledge
sharing of best practices, the amount of rigorous and regular performance
measurement, the scheduling of the training provided, the skills taught, the
pedagogical approaches used, and the marketing carried out. Responses were
analyzed using both quantitative and qualitative data analysis techniques.
Results – Public library administrators and instructors in Canada are generally
satisfied with the delivery of digital literacy training; however, room for
improvement exists. Instructors are more positive about the delivery of this
training than administrators. Findings support and extend the research team’s
conceptual model, specifically in terms of providing more insight and clarity
on how the learning environment and program components affect the delivery of
digital literacy training led by public libraries. Results highlight how
training is situated in context and how libraries need to fine-tune the
delivery of this training in ways that are reflective of libraries’ learning
environments and program components.
Conclusion – Results are of high interest
to researchers and library practitioners who wish to leverage evidence-based
library and information practice to understand and address the factors
affecting the successful delivery of public library-led digital literacy
training. Though funding is always an obstacle for any public service
organization, libraries can make improvements to the delivery of their training
in other ways, such as carrying out more robust performance measurement and
using results more transparently, participating in more knowledge-sharing
opportunities, and better understanding learner needs and preferences.
Digital literacy
refers to the “set of skills, knowledge and attitudes required to access,
create, use, and evaluate digital information effectively, efficiently, and
ethically” (Julien, 2018, p. 2243). Digital literacy is about the awareness,
attitudes, and ability to appropriately use digital tools to manage information
in the digital age, including the ability to identify, access, manage,
evaluate, analyze, synthesize, construct, create, and communicate information
(Bawden, 2008). By these definitions, to be digitally literate implies knowing
not only how to operate digital devices (such as laptops, smartphones, and
tablets) but also how to critically assess the information accessed through
these devices.
In today’s
information society, people need to be digitally literate to fully participate
and thrive. Digital literacy can bridge the digital divide (Abdelaal &
Andrey, 2022) and is a critical competence for empowering citizenship in a
digital world (Marín & Castaneda, 2022). There are substantial benefits in
being digitally literate, including more positive health outcomes (as people
are more able to obtain high quality health information online), better access
to government services, improvements in workforce development (improved job
performance, employment), reduced social isolation, and improved protection
against online threats such as phishing scams and identity theft (Detlor et
al., 2022; Julien, 2018).
However, many
members of society – especially those from marginalized populations such as
seniors, youth at risk, newcomers, and immigrants – may lack the financial
means or necessary educational prerequisites to obtain digital literacy skills
training (Abdelaal & Andrey, 2022). Fortunately, public libraries deliver
low-barrier, very low-cost or free training that these populations can access
(Manžuch & Macevičiūtė, 2020). Public libraries provide community members
with complimentary access to digital literacy training, Wi-Fi, and various
information technologies, from basic computers to advanced tools. They often
offer diverse digital training methods, ranging from self-guided tutorials to
personalized support, group sessions, guest lectures, specialized professional
training, community partnerships, and technology-centered spaces (Barrie et
al., 2021; Julien et al., 2021; Wynia Baluk et al., 2021).
While the
provision of digital literacy training is a core public library service
(Nordicity, 2018), it is uncertain how different library personnel perceive the
delivery of this training. Administrators, who are responsible for the overall
management, operations, and fiscal health of the organization, may have
different views than those of instructors, who are responsible for the direct
face-to-face delivery of the training. As such, the study asks, “How do
public library administrators and instructors differ in their perceptions of
the digital literacy training they provide to the communities they serve?”
Answering this
question is important to public libraries as they play a pivotal role in
digital literacy promotion in their communities. Little research is available
on the provision of digital literacy training by public libraries, particularly
in terms of the factors that promote or impede successful digital literacy
training delivery. Libraries would benefit from understanding these factors,
especially in terms of the perspectives of library administrators and
instructors, as a first step towards implementing actions within their
libraries that foster successful training and mitigate or eliminate barriers
that impede delivery.
As
described in the previous section, in today’s digital age, digital literacy –
the ability to navigate, understand, and effectively use digital tools and
information – is paramount. One way to obtain digital literacy skills is
through the provision of digital literacy training. Digital literacy training
is an encompassing educational initiative aimed at equipping individuals with
the skills, competencies, and knowledge required to function
confidently in a technology-driven environment (Barrie et al., 2021; Bawden,
2008).
To improve
digital literacy skills among community members, digital literacy training is
needed. However, barriers to the delivery of digital literacy training to
community members exist, especially to marginalized populations such as
seniors, at-risk youth, newcomers, and immigrants. These barriers include: a
lack of access to the Internet, data, hardware, and software; the inability to
pursue education and training opportunities due to financial, mobility and geographic
restrictions; learners not seeing themselves reflected in the digital literacy
training programs provided; intimidation and fear of failure; and insufficient
intermediate-level digital literacy training opportunities (Elfert, 2019; Huynh
& Malli, 2018; Smythe et al., 2021).
Importantly,
public libraries, as well as other local community organizations such as social
service agencies and not-for-profits, play a key role in overcoming these
barriers. They provide free or low-cost training opportunities and strive to
serve those from marginalized groups who may lack access to the training
typically available to their more privileged counterparts (Julien et al., 2021).
Public libraries
play a key role in the promotion of digital literacy skills to the communities
they serve. They offer communities free access to digital literacy training,
Wi-Fi, computers, and tablets, as well as more sophisticated technologies
(Detlor et al., 2022). They provide digital literacy training and support for
both beginners and those with more advanced skills. This can take a variety of
forms, such as self-guided help, one-to-one support, group, one-off, and
multi-event informational sessions, guest speakers, digital training targeted
for specific professions, partnerships with community organizations, and
makerspaces or technology learning hubs (Julien et al., 2021). Furthermore,
public libraries strive to create a welcoming social space where many people
feel comfortable asking questions (Barrie et al., 2021; Wynia Baluk et al.,
2021).
In this way,
public libraries are at the forefront of addressing the digital divide,
offering targeted digital training and services for those who are marginalized
(Wynia Baluk et al., 2021). For example, in Andrey et al.’s (2021) multi-method
online and phone survey of 2,500 Toronto residents to better understand
Internet and device access, “42% of those in Toronto without home Internet use
the public library for access, compared to 16% overall” (p. ix).
By promoting
digital literacy and inclusion, public libraries help make information, skill
development, education, social media, and resources accessible to marginalized
populations (Wynia Baluk et al., 2023). Further, by providing local community
members with free or low-cost digital skills training, public libraries have
become important community digital learning hubs (Nordicity, 2018).
Contributions to the economic health of communities and the financial success
of individuals are major reasons why public libraries teach digital skills
(Horrigan, 2015; Public Library Association, 2024). The public wants libraries
to teach digital literacy skills and supports public libraries’ efforts to help
vulnerable populations in this regard (Horrigan, 2015).
Despite the
benefits to community members regarding the digital literacy training they
receive from public libraries, certain challenges confront public libraries in
offering this service. For example, challenges arise from a lack of resources,
including staff time, and limited staff expertise, as well as competition for
learners’ time (Julien et al., 2021). Further, though public library
administrators and instructors espouse idealistic intentions with their digital
literacy programs, particularly to give marginalized people increased
educational and vocational opportunities, and are confident in the success of
these programs, little formal assessment or evaluation of the learning outcomes
of these digital literacy training programs occurs (Julien et al., 2022). There
is a need to more fully and systematically evaluate the outcomes of the digital
literacy training programs that public libraries provide to assess whether
program goals are being met and ongoing investment of resources is merited
(Julien et al., 2022).
One way that
public libraries are successfully mitigating the challenges of delivering
digital literacy training is through partnerships with other community
organizations (Wynia Baluk et al., 2021). For example, a recent case study in London,
Ontario, of a partnership between two public library systems, a volunteer
seniors’ organization, and a seniors’ centre demonstrates the benefits and
challenges of a community partnership approach to the delivery of digital
literacy training (Elgamal et al., 2024). Benefits identified in that case
study included mitigating access to social network constraints and providing
more personalized, socially engaging, and flexible digital literacy training,
while challenges pertained to overcoming inherent tensions over differences in
organizational structures and ways of working.
Information
literacy and educational assessment theories (Boyer & Ewell, 1988;
Lindauer, 2004; Sims, 1992) propose that the learning environment in which
instruction occurs and program components (i.e., the specific features of the
instruction itself) collectively influence learning outcomes. Empirical
investigations of information literacy instruction given by academic librarians
to university students support the notion of a cause-and-effect relationship
between instructional training factors (i.e., the learning environment, program
components) and learning outcomes (i.e., psychological, behavioural, and
benefit outcomes; Detlor et al., 2011; Serenko et al., 2012).
The need to
contextualize the delivery of instruction in terms of the learning environment
and program components is supported by situated learning theory. According to
situated learning theory, learning is situated in context (Lave, 2009) and
there is a need to provide best practice in situated-learning environments
(Brown, 2006; Brown et al., 1989). For example, Kurt (2021) identifies specific
“situated learning” guidelines when designing the delivery of instruction: i)
learners should be presented with realistic and relevant problems to solve; ii)
instructors should serve as facilitators or coaches rather than as lecturers;
iii) learning should promote reflection, discussion, and evaluative thinking
where learners are actively engaged; and iv) the content of a course should not
comprise neat packages of information taught by an instructor, but rather
involve contextual and real-life learning activities.
Using this
theoretical background as a guide, the authors of this paper conducted
exploratory case study investigations of community-led digital literacy
training (Detlor et al., 2022). Results from that study led to the generation
of a theoretical framework where the learning environment and program
components were shown to influence learning outcomes that impacted overall
digital literacy training success. Specifically, factors of the learning
environment such as resources, budgets, and performance measurement affected
learning outcomes. For example, a lack of skilled instructors or up-to-date
computer labs hindered constructive teaching. Restricted budgets limited what
type of training could be offered and how that training could be delivered.
Instructional programs that were rigorously and regularly evaluated led to the
delivery of higher-quality training. Program components, such as the length of
a training session, the amount of material delivered, the type of skills
taught, the amount of experiential learning, and the timing of the instruction
impacted the delivery of the training. For instance, having the length of a
training session match the time required for students to comprehend and master
the material being delivered led to positive learning outcomes. When the skills
taught were most likely to improve a student’s life, when interactive, hands-on
learning activities were provided, and when teaching times were convenient to
students, better learning outcomes occurred.
Figure 1
presents a conceptual model based on the theoretical framework generated from
the research team’s prior case study investigations of community-led digital
literacy training (Detlor et al., 2022).
Figure 1. Conceptual model (adapted from Detlor et al., 2022).
As
Figure 1 illustrates, both the learning environment and program components
impact digital literacy training delivery. The learning environment represents
the learning context surrounding the delivery of instruction and includes
factors such as program funding, classroom resources, teaching staff, support
staff, knowledge sharing, and performance measurement. The following
propositions are implied:
- [P1]: The greater the availability of sustained program funding, the better the delivery of the digital literacy training provided.
- [P2]: The greater the adequacy of dedicated classroom resources, the better the delivery of the digital literacy training provided.
- [P3]: The greater the provision of proficient and sustainable teaching staff, the better the delivery of the digital literacy training provided.
- [P4]: The greater the provision of proficient and sustainable support staff, the better the delivery of the digital literacy training provided.
- [P5]: The greater the amount and frequency of knowledge sharing of best practices, the better the delivery of the digital literacy training provided.
- [P6]: The greater the amount of rigorous and regular program performance measurement, the better the delivery of the digital literacy training provided.
Program
components represent specific features of the instruction itself and include
factors such as scheduling, skills taught, pedagogical approach, and marketing.
The following propositions are implied:
- [P7]: The better the scheduling of instruction matches learner needs, the better the delivery of the digital literacy training provided.
- [P8]: The better the skills taught match learner needs, the better the delivery of the digital literacy training provided.
- [P9]: The better the pedagogical approach matches learner needs, the better the delivery of the digital literacy training provided.
- [P10]: The greater the depth and breadth of marketing, the better the delivery of the digital literacy training provided.
The
study’s research question asks how public library administrators and
instructors differ in their perceptions of the digital literacy training they
provide to the communities they serve. To answer this research question, an
online survey was administered via Qualtrics to public libraries across Canada.
The survey was based on the study’s conceptual model (Figure 1). The survey
polled library administrators and instructors about their perceptions of their
library’s delivery of digital literacy training to the public, specifically in
terms of factors of the learning environment and program components.
The
online survey took approximately 20 minutes to complete and included closed-
and open-ended questions concerning the availability of adequate sustained
funding, the adequacy of dedicated classroom resources, the competency of
teaching staff, the helpfulness of support staff, the amount and frequency of
knowledge sharing of best practices, the amount of rigorous and regular
performance measurement, the scheduling of the training provided, the skills
taught, the pedagogical approaches used, and the marketing carried out (see the
Appendix). Both English and French versions of the survey were created.
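To make the scale structure concrete, here is a minimal sketch (in Python) of how a respondent's three Likert items for one construct could be combined into a single score. The three-item averaging rule, the example construct, and the variable names are assumptions for illustration only; the paper does not document its exact scoring procedure.

```python
# Illustrative sketch only: the averaging rule below is an assumption, not the
# authors' documented procedure. Each construct in the Appendix (e.g., funding)
# is measured with three 7-point Likert items (1 = strongly disagree,
# 7 = strongly agree).
funding_items = {"sufficient": 6, "sustainable": 5, "flexible": 5}  # one hypothetical respondent

# One plausible construct score: the mean of the three item ratings
funding_score = sum(funding_items.values()) / len(funding_items)
print(f"Funding construct score: {funding_score:.2f}")  # 5.33 on the 1-7 scale
```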
A
pilot test was conducted at one public library to assess the validity of the
survey instrument prior to the roll-out of the full survey to public libraries
across Canada. As a result of the pilot test, a couple of survey questions were
dropped when: i) they pertained to a library’s statistics (e.g., collection size,
number of library users) and these data could be gathered by other means (e.g.,
via a library’s annual report); and ii) the answers to these questions varied
considerably among survey respondents from the same library. Some pilot test
participants indicated that certain closed-ended questions were too similar in
what they asked. These questions were reworded to make them more distinct
from one another.
Two
national public library associations – the Canadian Urban Libraries Council and
the Canadian Federation of Library Associations – assisted with the recruitment
of participants from their memberships. Three separate rounds of recruitment
occurred where emails soliciting participation were sent directly to public
library personnel. Each participant was eligible to receive a twenty-dollar
online Amazon gift card as an incentive to complete the survey. This funding was
made possible through a grant from the Social Sciences and Humanities Research
Council of Canada. Email address information collected from participants to
distribute the Amazon gift cards was stored separately from participants’
survey responses.
Descriptive
statistics were used to analyze numerical responses. Because of small and
unequal sample sizes between administrators and instructors, the data did not
meet the requirements of parametric statistics. Therefore, the Mann-Whitney U
non-parametric test was used to examine differences between administrators and
instructors on each variable. Thematic analysis (Miles et al., 2014) was
employed to discover and investigate themes in the textual answers provided to
open-ended questions.
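As a reproducibility aid, the sketch below illustrates the kind of group comparison just described. The ratings are hypothetical (the study's raw data are not reproduced here), and SciPy's mannwhitneyu function is assumed as the test implementation, since the paper does not name its statistical software.

```python
# Illustrative sketch only: hypothetical ratings, not the study's actual data.
# Compares administrator and instructor ratings of one construct (e.g., program
# funding) with the Mann-Whitney U test, as described in the Methods section.
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical 7-point Likert construct scores (1 = strongly disagree, 7 = strongly agree)
admin_ratings = np.array([5, 4, 6, 5, 3, 5, 6, 4, 5, 5])
instructor_ratings = np.array([6, 6, 7, 5, 6, 6, 7, 6])

# Descriptive statistics (the paper reports group medians)
print("Median (administrators):", np.median(admin_ratings))
print("Median (instructors):", np.median(instructor_ratings))

# Two-sided Mann-Whitney U test for two independent, non-normal samples
u_stat, p_value = mannwhitneyu(instructor_ratings, admin_ratings, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```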
In
total, 45 respondents (i.e., 29 administrators and 16 instructors) completed
the survey. These respondents represented approximately 20 different public
libraries across Canada (not all respondents identified their library in the
survey responses).
Though
a larger response rate would have been preferred, a diverse and representative
sample of public libraries of varying sizes across different regional areas of
Canada was obtained. The public libraries surveyed comprised both rural and
urban libraries, were situated in different provinces across the country, were
of varying sizes (e.g., large, medium, small), offered a range of different
digital literacy training programs (some more elaborate and extensive than
others), had varying numbers of library personnel dedicated to digital literacy
training, had different amounts of training resources available (e.g.,
classroom space; computer equipment), and had different levels of funding
available for digital literacy training (some more extensive than others). In
this way, the sample was representative of the general diversity of public
libraries across Canada, which span different geographic regions, rural and
urban settings, library sizes, as well as staff, resources, and budgets
allocated to the delivery of digital literacy training.
Many
participants gave detailed responses to open-ended questions asked in the
survey; these provided much needed context to interpret answers to the
closed-ended questions.
Of
those respondents who reported demographic information, a nearly equal number
of administrators identified as men (12) and women (14). More instructors
identified as women (9) than men (6). Gender-inclusive options were provided
within the demographic questions; however, no participants self-identified as
trans or non-binary in these responses.
The
average age of administrators and instructors was 42 and 35 years, respectively.
On average, administrators and instructors had 15 and 7 years of work experience
in the library field, respectively. Most administrators possessed a graduate (21)
rather than an undergraduate (4) degree, while a more even split between graduate
(9) and undergraduate (6) degrees was reported by instructors. Non-parametric
statistical analyses revealed no statistically significant differences between the
groups in terms of gender, age, work experience, or education at the p < 0.05 level.
Overall,
survey respondents were generally satisfied with program funding, classroom
resources, teaching staff, support staff, knowledge sharing, and performance
measurement with respect to the delivery of digital literacy training in their
libraries.
Results
showed that, on average, instructors viewed program funding more
favorably than administrators (U = 321.5, median instructors
= 6.00, median administrators = 5.00, p = 0.032). Analysis of
qualitative comments suggests that administrators, who are tasked with library
oversight and budgetary responsibilities, are more acutely aware of financial
constraints that impact sufficient program funding – something instructors may
not fully appreciate. For example, as one administrator noted, “Funding is
insufficient, but it is stable… proper funding for digital literacy requires a
healthy budget for materials as well as staff and their time.” Instructors,
on the other hand, were more apt to focus on what could be done with the
funding that is provided and think of creative ways to make do, rather than
focus on what could be done if more funding were available. For example, one
instructor stated, “No funding is required. I use devices the library
already owns for demonstrations, and I advise patrons to bring their own
devices for assistance.”
Results
showed that, on average, instructors viewed classroom resources more
favorably than administrators (U = 346.0, median instructors
= 6.00, median administrators = 5.00, p = 0.006). This
difference stems from instructors’ direct engagement with classroom resources,
leading to a greater appreciation of them. For instance, one instructor noted,
“The computer lab is good and works well for the basic classes we offer,”
highlighting satisfaction with classroom resources for their immediate teaching
needs. By contrast, administrators were more aware of general institutional
classroom resource needs and the challenges in maintaining or updating
classrooms, as reflected in one administrator’s comment: “The library
received a large batch of classroom resources ~5 years ago. Some of these are
broken through wear and tear.” Interestingly, all administrators (100%)
commented on classroom spatial constraints, whereas no instructors did. As one
administrator pointedly stated, “We currently don't have dedicated space for
digital literacy instruction. Programming in this location is similar to
scramble parking,” underscoring the general lack of adequate classroom
facilities in public libraries and the keen awareness of this limitation by
administrators. As administrators need to routinely manage classroom allocation
and solve corresponding problems (e.g., find suitable rooms), whereas
instructors are always assigned a room, it is understandable that obtaining
suitable classrooms is less of an issue for instructors.
In
terms of teaching staff, most instructors had many years of teaching
experience, but as one instructor noted, teaching skills were achieved on the
job and not through formal education: “Over 20 years I think. I was not
trained to be a teacher though. Previous jobs have put me in a position where I
had to train.” Instructors noted challenges including a lack of time to
prepare and learners’ lack of English or French (the language of instruction)
when carrying out their teaching. Administrators rated their instructors highly
but acknowledged that while the instructors were knowledgeable about the
content they taught, they could benefit from being more pedagogically prepared.
Because closed-ended teaching staff questions were presented to library
administrators only, no statistical analysis was done comparing views on
teaching staff between instructors and library administrators.
No
statistically significant differences between instructors and administrators
with respect to their views of support staff were observed (U =
168.5, median instructors = 4.67, median administrators =
5.67, p = 0.126). An examination of the qualitative comments showed that
though both administrators and instructors shared similar concerns regarding
staffing challenges, instructors were less confident in the competence of their
current support staff and were more critical of the challenges associated with
increasing that competence. Further, instructors were less
positive about the collaborative and resourceful support received from support
staff. Instructors’ harsher perceptions of support staff in the delivery of
digital literacy training may be due to instructors’ direct operational dependence
on support staff to help facilitate that training.
No
statistically significant differences were found between instructors and
administrators with respect to their views of internal knowledge sharing
(U = 261.0, median instructors = 5.67, median administrators
= 5.67, p = 0.485) and external knowledge sharing (U =
241.0, median instructors = 4.00, median administrators =
4.00, p = 0.829). Internal knowledge sharing refers to the sharing of
knowledge about the delivery of digital literacy instruction within one’s
library branch among instructors, administrators, and staff. External knowledge
sharing pertains to the sharing of knowledge about the delivery of digital
literacy instruction with outside stakeholders (e.g., other libraries,
community agencies).
The
Wilcoxon Signed-Rank Test (i.e., the non-parametric equivalent of the paired t-test)
revealed that respondents rated internal knowledge sharing much higher than
external knowledge sharing (Z = -2.914, median internal = 5.67,
median external = 4.00, p = 0.004). In their open-ended
comments, respondents further described that knowledge sharing about the training
happens internally but could be improved, and that external knowledge sharing
with other public libraries happens less often, but when it does, instructors
find it extremely valuable. Knowledge about how a course is taught
is typically shared informally, via word of mouth. For example, one instructor
explained how they regularly inform other colleagues in their library about the
courses they teach: “I make consistent efforts to educate my fellow staff
and my manager about what courses I'm offering, what they entail, and invite my
colleagues, where available, to sit in on or even co-lead courses with me.”
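For illustration, the paired comparison reported above could be run as in the sketch below. The ratings are hypothetical, and SciPy's wilcoxon function is assumed as the implementation; note that it reports the W statistic rather than the normal-approximation Z value given in the paper.

```python
# Illustrative sketch with hypothetical paired ratings (not the study's data).
# Each respondent rates both internal and external knowledge sharing, so the
# samples are paired and the Wilcoxon signed-rank test applies. Note: SciPy
# reports the W statistic; the paper reports the normal-approximation Z value.
import numpy as np
from scipy.stats import wilcoxon

internal = np.array([6, 5, 6, 7, 5, 6, 4, 6, 5, 6])  # internal knowledge sharing
external = np.array([4, 4, 5, 4, 3, 5, 3, 4, 4, 5])  # external knowledge sharing

print("Median (internal):", np.median(internal))
print("Median (external):", np.median(external))

# Two-sided Wilcoxon signed-rank test on the paired differences
stat, p_value = wilcoxon(internal, external, alternative="two-sided")
print(f"W = {stat:.1f}, p = {p_value:.3f}")
```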
Around
half of respondents indicated that their library employed performance measures
(i.e., the evaluation of the digital literacy training provided including
satisfaction surveys and participant feedback) to assess the quality of digital
literacy instruction. When performance measures were used, respondents believed
that such measures were generally helpful, comprehensive, and applied
consistently. When performance measures were not employed, about two-thirds of
respondents indicated that their library used to collect such measures in the
past, but this practice was later discontinued.
On
the whole, survey respondents consistently reported that performance
measurement primarily involved the collection of rudimentary statistics
(e.g., attendance) and the administration of short evaluation questionnaires.
However, this evaluation work was not done on a consistent basis. As one
administrator noted, “Post-attendance surveys are delivered to learners
randomly rather than consistently in our system,” highlighting the
irregular rhythm of feedback collection. Instructors were more optimistic than
administrators about the utility of the performance metrics collected, largely
because instructors were more likely to use the feedback received
to make changes to their own training. Administrators, by contrast, were
concerned about the lack of use of performance measurement data for
higher-level library decision making. For example, according to one
administrator, “I don't know how the library actually uses the information
they get from surveys,” highlighting the lack of certainty among administrators
regarding performance metrics.
Survey
respondents were generally satisfied with their libraries’ scheduling of the
training, the skills taught, the pedagogical approaches used, and the marketing
approaches utilized concerning the delivery of digital literacy training in
their libraries.
Results
showed that, on average, instructors viewed the scheduling of digital
literacy training sessions more favorably than administrators (U =
327.5, median instructors = 6.00, median administrators =
5.33, p = 0.011). While both administrators and instructors were aware of
the importance of scheduling in influencing the uptake of digital literacy
training sessions by library users, only the instructors – and all 100% of them
– reported the need to tailor the delivery of the training to participant
availability. For example, one instructor wrote, “We provided this course at
varying times and dates throughout the spring and summer to accommodate
different schedules.” None of the administrators mentioned the need to
adapt the timing of the instruction to match the needs of learners.
Instructors’ closer connection to the direct impact of scheduling on
participants likely led to a greater awareness of the need to tailor the
delivery of the training to participant availability. Administrators, who
manage broader organizational and operational concerns, may be less sensitive
to the need for, and importance of, scheduling training sessions that best suit
the needs of learners.
Results
showed that, on average, instructors viewed the relevance of the skills
taught in the digital literacy training sessions more favorably than
administrators (U = 318.5, median instructors = 6.00, median administrators
= 5.50, p = 0.016). Administrators were more critical of the lack of
advanced digital skills training offered in their libraries. For example, one
administrator said, “We could offer more advanced skills; however, with
limited staff devoted specifically to digital literacy it is hard to offer a
robust programming schedule.” Another administrator stated, “I hope we
can soon offer more programs that reach more levels of digital literacy.”
By contrast, instructors were more appreciative of the focus and delivery of
basic digital literacy skills training offered by their libraries. For
instance, many instructors described how in their training they “begin from
the premise that you’ve never seen a computer before,” highlighting a
commitment to teach essential skills. One instructor, commenting on the
adequacy of skills taught, stated: “…I do think they were appropriate for
the subject and intended audience experience level.” Noteworthy was the
fact that only 14.7% of administrators versus 85.3% of instructors discussed
the use of specialized technology in the training. One instructor boasted, “participants
are learning industry-standard digital literacy skills in the areas of
photography, filming, music recording and digital design,” indicating a
high regard for the specialized skills taught within their library training
program. Interestingly, administrators were much more concerned than instructors
about the outreach challenges of getting participation from marginalized
groups.
Results
showed that, on average, instructors viewed the pedagogical approaches used
in the digital literacy training sessions more favorably than administrators (U
= 310.5, median instructors = 6.00, median administrators
= 5.33, p = 0.031). Instructors felt they were able to adequately teach
digital literacy basics, but acknowledged there was room to expand. As one
instructor pointed out, “In a one-hour course, there's not a whole lot of
room to extend far beyond basic demonstrative approaches,” which supports
the notion that while foundations are covered, there is a desire for deeper
engagement. Instructors were generally appreciative of the autonomy they had to
modify the course. For instance, one instructor commented, “When I offer
this course, I feel confident that I'm able to tailor it to the needs of the
group of attendees”; while another said, “I'm able to adapt the content
on the fly to be as relevant as possible for members.” Instructors also
commented, much more than administrators, on the importance of incorporating
hands-on learning into the training. Instructors’ comments, like “Everything
is hands-on and one-on-one” and “I would like to have more hands-on
practice in the next session,” demonstrate a commitment among instructors
to offer engaged, active learning. Administrators, on the other hand, were more
concerned with incorporating learner needs into the training than with
teaching techniques. For example, all administrators placed high value
on participant motivation as part of an effective pedagogy. By contrast, no
instructor raised the issue of incorporating learner needs into the training,
possibly indicating that instructors focus on pedagogical methods rather than
on participant motivation in the learning.
No
statistically significant differences were found between instructors and
administrators in terms of their views of the effectiveness, timeliness, and
comprehensiveness of marketing approaches (U = 240.5, median instructors
= 4.83, median administrators = 4.50, p = 0.685). Overall,
survey respondents consistently reported that marketing was satisfactory
in that current traditional marketing approaches (e.g., “we use things like
our events calendar on our website”) fill classrooms. Little promotion is
needed to fill registration. According to the open-ended responses,
administrators were particularly concerned – much more than instructors – that
current approaches to advertising tend to reach only current library patrons
and not those who do not visit the library. Administrators recognize the
difficulties in reaching underserved and hard-to-reach populations who may not
be on social media, read a newspaper, or watch local television. By contrast,
instructors were more concerned than administrators about the ineffectiveness
or complete absence of marketing strategies to promote digital literacy
training in their libraries. According to one instructor, tight program
scheduling due to constant staff changes impacts the ability to successfully
market the training ahead of time.
The
findings presented above indicate a general level of satisfaction among public
library administrators and instructors with the delivery of digital literacy
training, though there is room for improvement. Classroom funding is adequate
for current training needs, but, if more funding were available, then libraries
would be able to offer more frequent training, a more robust curriculum that
comprises a more varied set of courses at both basic and advanced levels, and
better classroom resources including up-to-date information technologies.
Pedagogical training of library instructors would be beneficial as well. A more
concerted effort with internal and external knowledge sharing would help spur
innovation, boost creativity, and encourage best practices in the delivery of
digital literacy training within libraries. Performance measurement is
under-utilized; better recording and analysis of evaluation metrics would go a
long way in assessing the delivery of the training and identifying areas for
improvement. The scheduling of courses could be better aligned to learners’
timing preferences. Allowing instructors to have flexibility in their teaching
approaches and encouraging active “hands-on” learning opportunities are key
success factors in the delivery of the training. To improve, more focus could
be placed on understanding and meeting learner needs, especially in terms of
the content taught. More comprehensive advertising approaches targeting a
broader range of constituents would increase reach, especially to those who
traditionally do not visit the library.
Importantly,
differences were identified between administrators and instructors in the
delivery of digital literacy training. Instructors have a more positive
perception of the operational aspects of the training such as funding,
classroom resources, scheduling, skills taught, and pedagogical approach.
Instructors’ closer engagement with program delivery may positively influence
these perceptions. Administrators, on the other hand,
have a broader focus on the health and sustainability of the training in general
to ensure that it meets community needs and is accessible to all community
members while being less concerned about specific training details.
These findings
support and extend earlier work in this area (e.g., Barrie et al., 2021;
Elgamal et al., 2024; Julien et al., 2021; Julien et al., 2022; Wynia Baluk et al.,
2023), specifically
in terms of providing more insight and clarity on how
the learning environment and program components affect the delivery of digital
literacy training led by community organizations, such as public libraries
(Detlor et al., 2022). For instance, public
libraries can provide instructors with better training opportunities so they
can be more up to date on the technical aspects of the training they provide,
as well as the best pedagogical approaches to utilize. Public libraries can
take a more proactive approach to acquire sustainable funding through
examination of new and sustainable funding models for digital literacy
training. Public libraries can offer training at more convenient times to
learners. Public libraries can better market the training as current marketing
methods primarily reach only those who already visit a
public library. They can also better share best practices learned in the
delivery of the training to both internal and external audiences. Better
collection and analysis of training performance metrics are also needed.
Currently, minimal performance measurement occurs; there is ample room to
collect more extensive and richer quantitative and qualitative metrics.
Results
also highlight how training is situated in context (Lave, 2009), and how
libraries need to fine-tune the delivery of this training in ways that are
reflective of libraries’ situated learning environments (Brown, 2006; Brown et
al., 1989). For example, as recommended by Kurt (2021) and this study’s
findings, the delivery of digital literacy training could be improved by having
instructors provide learners with real-life “hands-on” learning activities
where learners actively engage and interact with training content of
interest and relevance to them.
This
study is constrained by certain limitations, namely a relatively low
participation rate of libraries. There are 642 public library systems in Canada
(Bush, 2024), whereas this study included approximately 20 different public
libraries across the country. Though a diverse and representative sample was
obtained, a greater number of libraries, as well as individual respondents,
would have been preferred. A larger and more balanced (i.e., equal)
number of administrators and instructors would have allowed the use of
parametric statistics in the analysis of the data. Despite these limitations, a
sufficient amount of quantitative and qualitative data was obtained to conduct
a robust analysis and elicit findings.
In
addition to obtaining a larger survey sample, future studies would also benefit
from administering surveys to learners (i.e., end-users, students) who received
the digital literacy training, in order to assess the delivery of the training
from a learner’s perspective. The current study only captures administrator and
instructor perspectives of digital literacy training delivery.
The
study’s findings may generalize only to public libraries in
Canada and other similar jurisdictions, such as the United States. Future
studies may wish to administer the survey in other countries to see if similar
or different findings result. Other countries that offer more sustainable
digital literacy training to community members, such as Scandinavian countries,
may provide better and more stable funding to local community organizations for
digital literacy training, and thus their experience and impact rolling out
digital literacy training programs to local community members may differ. This,
however, needs to be studied and verified.
Last,
the study’s focus on digital literacy training specifically, rather than other
types of training a library provides, is a limitation in that the findings may
not necessarily extend to the delivery of other training content.
Despite
the study’s limitations, survey results are of high interest to researchers and
library practitioners who wish to leverage evidence-based library and
information practice to understand and address the factors affecting the
successful delivery of public library-led digital literacy training. Though
funding is always an obstacle for any public service organization, libraries
can make improvements to the delivery of their training in other ways, such as
carrying out more robust performance measurement and using results more
transparently, participating in more knowledge sharing opportunities, and
better understanding learner needs and preferences.
This
paper reports findings from an online survey of administrators and instructors
at public libraries across Canada concerning the delivery of digital literacy
training led by public libraries. The goal was to obtain a snapshot of the
Canadian public library-led digital literacy training landscape and explore
differences in perceptions of the training between public library
administrators and instructors. The survey was based on a conceptual model from
the research team’s prior case study investigations of community-led digital
literacy training. The survey asked questions concerning the learning
environment and program components, and their combined influence on the
delivery of instruction.
Results
indicate that public library administrators and instructors in Canada are
generally satisfied with the delivery of digital literacy training; however,
room for improvement exists. Instructors are more positive about the delivery
of this training than administrators.
Importantly,
findings support and extend the research team’s conceptual model, specifically
in terms of providing more insight and clarity on how the learning environment
and program components affect the delivery of digital literacy training led by
public libraries. Results highlight how training is situated in context and how
libraries need to fine-tune the delivery of this training in ways that are
reflective of libraries’ learning environments and program components.
Brian Detlor: Conceptualization (equal), Methodology (equal), Qualitative Analysis, Writing – original draft, Writing – review & editing
Alexander Serenko: Conceptualization (equal), Methodology (equal), Quantitative Analysis, Writing – review & editing
Tara La Rose: Conceptualization (equal), Methodology (equal), Writing – review & editing
Heidi Julien: Conceptualization (equal), Methodology (equal), Writing – review & editing
This
study is partially supported by a grant from the Social Sciences and Humanities
Research Council of Canada.
Abdelaal, N.,
& Andrey, S. (2022). Overcoming digital divides: What we heard and
recommendations. Ryerson Leadership Lab. https://dais.ca/reports/overcoming-digital-divides/
Andrey, S.,
Masoodi, M. J., Malli, N., & Dorkenoo, S. (2021). Mapping Toronto’s
digital divide. Ryerson Leadership Lab and Brookfield Institute for
Innovation + Entrepreneurship. https://dais.ca/reports/mapping-torontos-digital-divide/
Barrie, H., La
Rose, T., Detlor, B., Julien, H., & Serenko, A. (2021). “Because I’m old”:
The role of ageism in older adults’ experiences of digital literacy training in
public libraries. Journal of Technology in Human Services, 39(4),
379–404. https://doi.org/10.1080/15228835.2021.1962477
Bawden, D.
(2008). Origins and concepts of digital literacy. In C. Lankshear, & M.
Knobel (Eds.), Digital literacies: Concepts, policies, and practices
(pp. 17–32). Peter Lang.
Boyer, C. M.,
& Ewell, P. T. (1988). State-based case studies of assessment
initiatives in undergraduate education: Chronology of critical points.
Education Commission of the States.
Brown, J. S.
(2006). New learning environments for the 21st century: Exploring
the edge. Change: The Magazine of Higher Learning, 38(5), 18–24.
Brown, J. S.,
Collins, A., & Duguid, P. (1989). Situated cognition and the culture of
learning. Educational Researcher, 18(1), 32–42.
Bush, O. (2024).
Public library statistics in Canada. Made in CA. https://madeinca.ca/public-library-statistics-canada/
Detlor, B.,
Julien, H., La Rose, T., & Serenko, A. (2022). Community‐led digital
literacy training: Toward a conceptual framework. Journal of the Association for Information Science and Technology,
73(10), 1387–1400. https://doi.org/10.1002/asi.24639
Detlor, B.,
Julien, H., Willson, R., Serenko, A., & Lavallee, M. (2011). Learning
outcomes of information literacy instruction at business schools. Journal of the American Society for Information
Science and Technology, 62(3),
572–585. https://doi.org/10.1002/asi.21474
Elfert, M.
(2019). Lifelong learning in Sustainable Development Goal 4: What does it mean
for UNESCO’s rights-based approach to adult learning and education? International
Review of Education, 65(4), 537–556. https://doi.org/10.1007/s11159-019-09788-z
Elgamal, R., La
Rose, T., Detlor B., Julien, H., & Serenko, A. (2024). A community
partnership approach to digital literacy training for older adults between
public libraries and seniors’ organizations. Canadian Journal of Information
and Library Science, 47(1), 3–17. https://doi.org/10.5206/cjils-rcsib.v47i1.16593
Horrigan, J. B.
(2015). Libraries at the crossroads. Pew Research Center. http://www.pewinternet.org/2015/09/15/libraries-at-the-crossroads/
Huynh, A., &
Malli, N. (2018). Levelling up: The quest for digital literacy.
Brookfield Institute for Innovation + Entrepreneurship. https://dais.ca/reports/levelling-up/
Julien, H.
(2018). Digital literacy in theory and practice. In M. Khosrow-Pour (Ed.), Encyclopedia of information science and
technology (4th ed., pp. 2243–2252). IGI Global.
Julien, H.,
Gerstle D., Detlor, B., La Rose, T., & Serenko, A. (2021). Digital literacy
training for Canadians. Part I: “It’s just core public works.” Library Quarterly, 91(4), 437–456. https://doi.org/10.1086/715918
Julien, H.,
Gerstle D., Detlor, B., La Rose, T., & Serenko, A. (2022). Digital literacy
training for Canadians. Part II:
Defining and measuring success. Library
Quarterly, 92(1), 87–100. https://doi.org/10.1086/717233
Kurt, S. (2021,
February 17). Situated learning theory. Educational Technology. https://educationaltechnology.net/situated-learning-theory/
Lave, J. (2009).
The practice of learning. In K. Illeris (Ed.), Contemporary theories of
learning (1st ed., pp. 200–208). Routledge.
Lindauer, B. G.
(2004). The three arenas of information literacy assessment. Reference & User Services Quarterly, 44(2),
122–129.
Manžuch, Z.,
& Macevičiūtė, E. (2020). Getting ready to reduce the digital divide:
Scenarios of Lithuanian public libraries. Journal of the Association for
Information Science and Technology, 71(10), 1205–1217. https://doi.org/10.1002/asi.24324
Marín, V. I.,
& Castaneda, L. (2022). Developing digital literacy for teaching and
learning. In O. Zawacki-Richter & I. Jung (Eds.), Handbook of open,
distance and digital education. Springer. https://doi.org/10.1007/978-981-19-0351-9_64-1
Miles,
M. B., Huberman, A. M., & Saldana, J. (2014). Qualitative data analysis:
A methods sourcebook. Sage.
Nordicity.
(2018). Technology access in public libraries: Outcomes and impacts for
Ontario communities. Toronto Public Library. https://www.torontopubliclibrary.ca/content/bridge/pdfs/nordicity-full-report.pdf
Public Library
Association. (2024). Digital literacy.
American Library Association. Retrieved June 2024, from http://www.ala.org/pla/initiatives/digitalliteracy
Serenko, A.,
Detlor, B., Julien, H., & Booker, L. (2012). A model of student learning
outcomes of information literacy instruction in a business school. Journal of the American Society for
Information Science and Technology, 63(4), 671–686. https://doi.org/10.1002/asi.22606
Sims, S. J.
(1992). Student outcomes assessment: A historical review and guide to
program development. Greenwood Press.
Smythe, S.,
Wilbur, A., & Hunter, E. (2021). Inventive pedagogies and social
solidarity: The work of community‑based adult educators during COVID‑19 in
British Columbia, Canada. International Review of Education, 67(1–2),
9–29. https://doi.org/10.1007/s11159-021-09882-1
Wynia
Baluk, K., Detlor, B., La Rose, T., & Alfaro-Laganse, C. (2023). Exploring
the digital literacy needs and training preferences of older adults living in
affordable housing. Journal of Technology in Human Services, 41(3),
203–229. https://doi.org/10.1080/15228835.2023.2239310
Wynia Baluk, K., McQuire, S., Gillett,
J., & Wyatt, D. (2021). Aging in a digital society: Exploring how Canadian
and Australian public library systems program for older adults. Public
Library Quarterly, 40(6), 521–539. https://doi.org/10.1080/01616846.2020.1811612
Digital Literacy Training Survey to Administrators
and Instructors at Public Libraries
Instructions
The
questions below pertain to your library branch. Course participants are defined
as individuals who take digital literacy instruction offered by your library.
Instructors are defined as individuals who teach digital literacy instruction
at your library. Please answer all questions below to the best of your
knowledge. Please answer all questions as per the current context of digital
literacy instruction at your library.
Library Branch Information
What
is the name of your library? (open-ended)
What
is the name of your library branch? (open-ended)
Library Branch Learning Environment
The
amount of funding allocated by my library branch for digital
literacy instruction is: (7-point Likert-type scale from 1 strongly disagree to
7 strongly agree).
a) Sufficient.
b) Sustainable.
c) Flexible.
Please
share your thoughts on the amount of funding allocated by your library branch
for digital literacy instruction. (open-ended)
Classroom resources provided by my
library branch for digital literacy instruction are: (7-point Likert-type scale
from 1 strongly disagree to 7 strongly agree).
a) Adequate.
b) Sustainable.
c) Up-to-date.
Please
share your thoughts on the classroom resources provided by your library branch
for digital literacy instruction. (open-ended)
[Teaching
staff questions only posed to library administrators]
Teaching staff (e.g.,
instructors) provided by my library branch for digital literacy instruction
are: (7-point Likert-type scale from 1 strongly disagree to 7 strongly agree).
a) Pedagogically prepared.
b) Knowledgeable in the topic of their teaching.
c) Available.
Please
share your thoughts on the teaching staff provided by your library branch for
digital literacy instruction. (open-ended)
Support staff (e.g., admin
assistants, help desk) provided by my library branch for digital literacy
instruction are: (7-point Likert-type scale from 1 strongly disagree to 7
strongly agree).
a) Helpful.
b) Knowledgeable.
c) Available.
Please
share your thoughts on the support staff provided by your library branch for
digital literacy instruction. (open-ended)
Knowledge sharing about the
delivery of digital literacy instruction at my library branch among
instructors, administrators, and staff is: (7-point Likert-type scale from 1
strongly disagree to 7 strongly agree).
a) Useful.
b) Commonplace.
c) Effective.
Please
share your thoughts on knowledge sharing about the delivery of digital literacy
instruction at your library branch among instructors, administrators, and
staff. (open-ended)
Knowledge sharing about the
delivery of digital literacy instruction with external stakeholders (e.g.,
other libraries, community agencies) is: (7-point Likert-type scale from 1
strongly disagree to 7 strongly agree).
a) Useful.
b) Commonplace.
c) Effective.
Please
share your thoughts on knowledge sharing about the delivery of digital literacy
instruction with external stakeholders (e.g., other libraries, community
agencies). (open-ended)
Does
your library branch use performance measures (e.g., satisfaction
surveys, participant feedback) to assess the quality of digital literacy
instruction? (Yes/No)
If
yes,
Performance
measures used by my library branch to assess the quality of digital literacy
instruction are: (7-point Likert-type scale from 1 strongly disagree to 7
strongly agree).
a) Helpful.
b) Comprehensive.
c) Applied consistently.
Please
share your thoughts on the performance measures used by your library branch to
assess the quality of digital literacy instruction. (open-ended)
If
no,
Has
your library branch or library system tried to establish such measures in the
past? (yes/no)
Library Branch Program Components
The
timing of digital literacy instruction at my library branch (when
courses are offered: season, day, time) is: (7-point Likert-type scale from 1
strongly disagree to 7 strongly agree).
a) Conducive to course participant schedules.
b) Conducive to instructor schedules.
c) Conducive to library branch schedules.
Please
share your thoughts on the timing of digital literacy instruction at your
library branch. (open-ended)
The
digital literacy skills (e.g., basic skills, advanced skills)
taught at my library branch are: (7-point Likert-type scale from 1 strongly disagree to 7 strongly agree).
a) Adequate.
b) Useful.
c) Appropriate.
Please
share your thoughts on the digital literacy skills taught at your library
branch. (open-ended)
The
pedagogical approaches (e.g., teaching methods) used in digital
literacy instruction at my library branch are: (7-point Likert-type scale from
1 strongly disagree to 7 strongly agree).
a) Effective.
b) Relevant.
c) Current.
Please
share your thoughts on the pedagogical approaches (e.g., teaching methods) used
in digital literacy instruction at your library branch. (open-ended)
The
marketing approach at my library branch to promote digital
literacy instruction is: (7-point Likert-type scale from 1 strongly disagree to
7 strongly agree).
a) Effective.
b) Timely.
c) Comprehensive.
Please
share your thoughts on the marketing approach at your library branch to promote
digital literacy instruction. (open-ended)
Demographics
What
is your age? _______ years old
What
is your gender? Man/woman/I identify as (please specify)/Prefer not to answer
What
is your highest level of education? (options: high school or less; college
diploma; undergraduate degree; master’s degree; doctoral degree)
For
how many years have you worked in the library field? (open-ended)