Web Usability Policies/Standards/Guidelines (PSGs) do not Influence Practices at ARL Academic Libraries

Authors

  • Shandra Protzko, National Jewish Health

DOI:

https://doi.org/10.18438/B89P6Q

Keywords:

academic librarianship, Web usability, policies, standards and guidelines

Abstract

A Review of:
Chen, Yu-Hui, Carol Anne Germain and Huahai Yang. “An Exploration into the Practices of Library Web Usability in ARL Academic Libraries.” Journal of the American Society for Information Science and Technology 60.5 (2009): 953-68.

Objective – To survey the current status of Web usability Policies/Standards/Guidelines (PSGs) found in academic libraries of the Association of Research Libraries (ARL). Researchers sought to investigate whether PSGs are in place, the levels of difficulty surrounding implementation, the impact of PSGs on design, testing, and resource allocation, and the relationship between ARL ranking and usability practice or PSGs.

Design – Survey.

Setting – North America.

Subjects – Academic libraries of the ARL.

Methods – An 18-question survey consisting of multiple choice, Likert scale, and open-ended questions was sent to all 113 ARL libraries in November 2007. Survey recipients, the person in charge of Web site usability at each library, were identified by visiting library Web sites and through phone inquiries. The survey closed in January 2008 with a response rate of 74% (84 institutions). The researchers used a t-test to detect any difference in ARL library ranking between libraries with and without PSGs, and pair-wise t-tests to identify gaps in the difficulty of implementing PSGs. In addition, they used Pearson's correlation to investigate significant correlations between variables such as ARL rank and resource allocation.
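For readers unfamiliar with these tests, the minimal sketch below (Python with scipy.stats, using entirely invented placeholder values rather than the study's data; the variable names are hypothetical) illustrates the kinds of analyses described above: an independent-samples t-test, a paired t-test, and Pearson's correlation.

# Illustrative only: invented data standing in for the study's variables.
from scipy import stats

# ARL ranking (lower number = higher rank) for libraries with and without PSGs
ranks_with_psg = [5, 12, 18, 27, 33, 41, 56, 60]
ranks_without_psg = [8, 15, 22, 39, 47, 58, 73, 90]

# Independent-samples t-test: does ARL ranking differ by PSG status?
t_stat, p_val = stats.ttest_ind(ranks_with_psg, ranks_without_psg)
print(f"ARL ranking by PSG status: t = {t_stat:.2f}, p = {p_val:.3f}")

# Paired t-test: difficulty implementing library PSGs vs. university PSGs
# (ratings from the same respondents, hence a paired comparison)
difficulty_library = [3, 2, 3, 4, 2, 3, 3, 4]
difficulty_university = [1, 2, 1, 2, 1, 2, 1, 2]
t_pair, p_pair = stats.ttest_rel(difficulty_library, difficulty_university)
print(f"Implementation difficulty (paired): t = {t_pair:.2f}, p = {p_pair:.3f}")

# Pearson's correlation: ARL rank vs. a resource-allocation measure (e.g., staff hours)
staff_hours = [40, 35, 20, 25, 10, 15, 5, 8]
r, p_corr = stats.pearsonr(ranks_with_psg, staff_hours)
print(f"ARL rank vs. resources: r = {r:.2f}, p = {p_corr:.3f}")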

Main Results – Of the 84 respondents, 34 (40%) have general library Web PSGs and 25 (30%) have specific usability PSGs; 41 (49%) have at least one type of in-library PSG. Of the 43 libraries (51%) without in-library PSGs, 30 (36% of all respondents) are at universities with institutional Web usability PSGs, and 26 of those 30 (87%) follow the institutional guidelines. There was no statistically significant relationship between ARL ranking and PSG status (see Table 1).


The authors asked about difficulty in implementing PSGs. Of the 32 libraries responding to a question about general library Web PSGs, most had slight or moderate difficulty. Twenty-three libraries with specific usability PSGs identified difficulty levels; some had no difficulty, but a majority had moderate difficulty. For the 26 libraries using institutional Web usability PSGs, most had no or slight difficulty. Pair-wise t-tests showed that library Web usability PSGs were significantly more difficult to implement than university Web usability PSGs.

Enforcement/agreement issues were reported as the primary difficulty in implementing in-library PSGs. Technical issues and ambiguity were obstacles at the institutional level. More than half of the 84 libraries have Web advisory committees and about one third have usability committees or Web usability subcommittees. Several libraries answered that they have none of these committees, but indicated that they have some sort of ad hoc committee or user study group to address usability issues.

Of the 84 respondents, 71 (85%) have conducted usability testing. Sixty-two libraries (73.8%) rated usability testing as important, very important, or extremely important; the importance rating did not correlate with ARL ranking. Cited most often in the open-ended responses were the importance of iterative testing, library-wide buy-in, and staff and resource availability. Main Web pages were tested most frequently. Fifty-three libraries (74.6%) tested their lower-level pages at least once. OPACs were tested least often. The amount of testing was affected neither by the existence of library Web PSGs nor by usability PSGs. The top two testing methods were in-person observation and think-aloud protocol.

Of the 84 libraries, 24 (28%) reported having staff dedicated to Web usability issues: twenty full-time and four part-time. There was a weak association between ARL ranking and hours worked by dedicated staff; no association existed for regular staff who take on Web responsibilities. Fifty-one libraries (60%) had regular staff whose duties included Web usability: forty-six full-time and five part-time. Training did not correlate with the number of testing methods used. There was a weak link between ARL ranking and availability of resources, and, the authors showed, more testing was done as resources increased. In response to a query about future Web usability plans, the focus was on usability testing and site redesign, with only three libraries planning to refine or establish usability PSGs.

Conclusion – The authors hypothesized that “web usability PSGs would influence usability practice within libraries and other institutions” (953). The data show that PSGs do not influence practices. The authors conclude that there is no significant relationship between PSGs and testing practices or PSGs and the availability of resources. Likewise, ARL ranking had no effect on the establishment of usability PSGs. Most libraries are conducting usability testing, and there was a weak link between ARL ranking and availability of testing resources. Highlighted in the open-ended questions is the lack of usability expertise among stakeholders. Workload, inadequate human resources, and lack of organizational cohesion are also cited as barriers to the adoption of Web usability PSGs. The authors speculate that Web professionals likely use their own working knowledge and internalized guidelines without having formal documentation. The authors further speculate that the difficulty related to creating mental models that adequately represent library tasks may hinder the use of formal usability PSGs. Additionally, libraries may not regard the lack of usability PSGs as a liability, especially in light of the lack of government mandates or standards. The authors recommend educational efforts for key players on the value of Web usability, support for hiring dedicated staff, and formal documentation to guide design practice. The authors plan to compare the collected PSGs in an upcoming project. Future research could focus on non-ARL libraries, the relationship between PSGs and user experience, and Content Management System (CMS) usability characteristics.

Author Biography

Shandra Protzko, National Jewish Health

Information Specialist, Tucker Medical Library, National Jewish Health, Denver, Colorado, United States

Published

2009-12-14

How to Cite

Protzko, S. (2009). Web Usability Policies/Standards/Guidelines (PSGs) do not Influence Practices at ARL Academic Libraries. Evidence Based Library and Information Practice, 4(4), 67–70. https://doi.org/10.18438/B89P6Q

Issue

Vol. 4 No. 4 (2009)

Section

Evidence Summaries
