PROGRAM DESCRIPTION / DESCRIPTION DU PROGRAMME

Results of a Usability Study to Test the Redesign of the Health Sciences Library Web Page1

Michelle Lemieux and Susan Powelson

Michelle Lemieux2. Knowledge Resource Service, Peter Lougheed Centre, Rm. 0634, 3500 26th Avenue NE, Calgary, Alberta, T3B 6A8.
Susan Powelson. Health Sciences Library, University of Calgary, Rm 1494A, 3330 Hospital Drive NW, Calgary, Alberta, T2N 4N1.

1This article has been peer reviewed.

2Corresponding author (e-mail: mrlemieu@ucalgary.ca)

JCHLA / JABSC 35: 49–54 (2014) doi: 10.5596/c14-023

Introduction

In September 2012, Libraries and Cultural Resources at the University of Calgary (U of C) launched a new main library web site that established new design standards; redesign of the branch library web sites followed in November. The redesign standardized the formatting of the library's web sites, which had previously differed significantly from the main library site. In the new design, each web page has the same navigation bar along the top, along with a web-scale resource discovery (discovery) search box. Web page content is constrained to a maximum of eight boxes, four on each row (a wireframe representation is shown in Figure 1). Boxes can be combined to increase their width, but their heights cannot be increased. These new standards required the Health Sciences Library (HSL) to significantly rework its web site. To ensure that the redesigned web site worked as intended, a usability study was conducted.

Fig. 1. New web page design standards wireframe.
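As a rough illustration of these constraints (not part of the original study), the short Python sketch below models the standard as two rows of four one-unit boxes, where widths can be combined but heights cannot; the Box and validate_layout names and the box labels are ours, invented for the example.

```python
# Minimal sketch of the design standard: at most eight boxes in two
# rows of four; a box may span extra columns (combined widths) but
# never extra rows (heights cannot be increased). Illustrative only.

from dataclasses import dataclass

MAX_ROWS = 2
COLUMNS_PER_ROW = 4

@dataclass
class Box:
    label: str
    col_span: int = 1  # widths may be combined
    row_span: int = 1  # heights are fixed

def validate_layout(rows: list[list[Box]]) -> None:
    """Raise ValueError if a layout breaks the design standard."""
    if len(rows) > MAX_ROWS:
        raise ValueError("more than two rows of boxes")
    for row in rows:
        if any(box.row_span != 1 for box in row):
            raise ValueError("box heights cannot be increased")
        if sum(box.col_span for box in row) > COLUMNS_PER_ROW:
            raise ValueError("row is wider than four box units")

# A layout in the spirit of the redesign: a double-width box plus two
# singles on top, four singles below (labels are hypothetical).
validate_layout([
    [Box("Major Resources", col_span=2), Box("Chat"), Box("News")],
    [Box("Clinical Care Tools"), Box("Research"),
     Box("Quick Links"), Box("Hours")],
])
```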

“Usability testing is a systematic way of observing actual users trying out a product and collecting information about the specific ways in which the product is easy or difficult” [1]. It is a simple process that involves a facilitator giving a participant a list of tasks to do and asking the participant to think aloud while completing them [2]; this is important for capturing not only the emotional and aesthetic responses users have to web sites but also their satisfaction with the layout and logic [3]. Dumas and Redish [1] list five aspects of usability testing:

  1. The primary goal is to improve the usability of the product.
  2. Participants are real users.
  3. Participants do real tasks.
  4. Participants are observed and recorded doing these tasks.
  5. Results are analyzed to identify problems and solutions.

Description

A brainstorming session was held with five HSL librarians and two members of the HSL reference staff to redesign the home page. Post-it notes were used to represent the boxes of content possible within the new design structure; they made it easy to move content around during the session and to visualize changes. To avoid duplicating links, the page was altered from a discipline focus (medicine, nursing, and veterinary medicine, Figure 2) to a task focus (major resources, clinical care tools, research, Figure 3). An attempt was made to avoid library jargon, as other usability studies have demonstrated that it can be a barrier for patrons [4, 5]. The final design from this meeting was agreed upon and then posted in a library workroom so that all HSL staff members could provide feedback. To ensure that the new design was easy for patrons to use, a usability study was conducted by two HSL librarians. Because user tests need to be run early to best incorporate users' needs [6], the study was conducted on a beta version of the web site so that any major flaws could be corrected before the official launch.

Fig. 2. Original Health Sciences Library home page.

Fig. 3. Redesigned HSL web page for testing.

Task list development

The first step was to develop a task list, a critical component of usability testing [7]. Letnikova [8] surveyed question sets used by academic libraries and suggested a list of standardized questions to improve the accuracy of usability testing, noting that careful attention must be paid to whether participants understand the tasks. Three HSL librarians were consulted about which aspects of library services they considered the most important to our patrons and therefore the most critical to test. Throughout development, we sought a balance between testing all possible web site functions and keeping the test under 15 minutes to minimize inconvenience to the participants. In the end, a list of eight questions, ordered from easiest to hardest, was developed [4]. The questions were phrased using language different from that on the web page, focusing on the activities that a patron visiting the web page would want to complete. The list of tasks can be found in Figure 4. Tasks ranged from booking a study room, to finding the full text of an article, to researching a topic.

Fig. 4. Usability test questions.
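To make the trade-off between coverage and the 15-minute limit concrete, here is a hypothetical sketch of a task list as data; the task wording and time estimates below are invented for illustration and are not the study's actual instrument (which appears in Figure 4).

```python
# Hypothetical task list, ordered from easiest to hardest, with
# invented time estimates; checks that a session fits the budget.

TIME_BUDGET_MINUTES = 15

tasks = [  # (task, estimated minutes)
    ("Book a study room", 1.0),
    ("Find the library's opening hours", 1.0),
    ("Place an interlibrary loan request", 1.5),
    ("Find the full text of a known article", 2.0),
    ("Find a guide on how to cite an article", 2.0),
    ("Ask a librarian a question online", 1.5),
    ("Choose a database for an education topic", 2.5),
    ("Research a clinical topic", 3.0),
]

total = sum(minutes for _, minutes in tasks)
assert len(tasks) == 8 and total <= TIME_BUDGET_MINUTES, \
    "trim or simplify tasks until the session fits the time budget"
print(f"{len(tasks)} tasks, about {total:.1f} minutes estimated")
```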

Participant recruitment

A minimum of three participants was needed for each distinct user group [9]. The user groups of interest to HSL were students and faculty, so a minimum of six participants was required. To recruit participants, study details were posted in the library and on electronic bulletin boards in the cafeteria of the main Health Sciences building, and e-mails were sent to student mail lists. These approaches were largely unsuccessful: only one participant was recruited through the mass e-mails. We therefore changed our approach, soliciting people using the computers in the library, asking anyone who came in for a search consultation, and contacting individual faculty members known to the librarians. Eight participants were recruited: four faculty members and four students.

Usability procedure

The procedure for the usability testing followed the recommendations from Krug [2] and was conducted by the primary author. Camtasia screen-recording software was used to record the voice and actions of the participants. A pretest was completed with the team's administrative assistant to confirm the effectiveness of the Camtasia technology and the length of the test. The same test script describing the test was read to each participant at the beginning of the session. Starting from the beta HSL web site on a desktop computer, the participants were encouraged to think aloud, describing the actions and decisions they were making. After the usability tests were completed, each Camtasia video was reviewed independently by the primary author and a second librarian to identify themes and issues raised during the testing; the additional reviewer was included to help reduce bias in interpretation [10]. The independent reviews were conducted on the same day as the tests so that the results were still fresh [11]. After all of the tests were completed, the two reviewers compared their notes and reached a consensus on the necessary web page changes [2] and key lessons.

Outcomes

The tests indicated that, for the most part, the new web site functioned as desired. The most interesting and informative results were:

  • Six of the eight participants (75%) inadvertently booked a workroom at the Taylor Family Digital Library (the main library on campus) instead of at the Health Sciences Library.
  • It was at least 30 seconds faster to find a known article using the discovery search box instead of searching by journal title and then navigating to find the correct volume, issue, and page number. Four participants (50%) used the discovery search box to find their article. Two (25%) searched by journal name and then navigated to the correct volume and issue. The remaining two (25%) started with the citation matching tools in PubMed/MEDLINE.
  • When researching a topic, five users started with PubMed (62.5%), one user started with Ovid MEDLINE (12.5%), two users started with the discovery search box (25%), and one user started looking in e-journals (12.5%).
  • When presented with a list of databases by subject, three participants mentioned familiarity as a factor in deciding which database to use. For example, when looking at a list of education databases, one participant said “I'd probably use Google Scholar; that's the only one that I recognize.”
  • The majority (62.5%) of participants could not find the library Research Guide (LibGuide) about how to cite an article. Problems included difficulty finding the Research Guides, difficulty choosing the correct guide among the several citation Research Guides that had been created, and difficulty navigating within the guide (participants did not see the page tabs). Three participants indicated that they would not use the library web site to find information on how to cite an article.
  • Three out of four (75%) faculty members mentioned that they would “get someone else to do it for them” in many cases instead of doing the searching themselves.
  • The largest delays in the test occurred when participants had to scroll down to find links.
  • The majority of users used the main navigation bar instead of the shortcut links included on the page. For example, only one participant used the Place an Interlibrary Loan link under Quick Links and only three used the Databases link.

Discussion

Branch web pages vs main library pages

One consequence of unifying the design of the branch web pages and the main library web site was that patrons were not aware when they had left the HSL web page. This resulted in patrons booking a workroom at the main library instead of at HSL. This finding is supported by anecdotal evidence from HSL reference staff, who have had students come to the reference desk looking for help locating their booked workroom, only to discover that it is at the main library. To address this issue, the ability to book a workroom at HSL from the main library web page was added.

Who to test

Because 75% of faculty members mentioned that they would not do some of the test tasks themselves but would have an administrative assistant do them, support staff should be added as a third category of patron, in addition to faculty and students, in any future usability studies.

Research guides

Users have trouble navigating within LibGuides (titled Research Guides on the U of C library site) because they do not notice the LibGuide tabs along the top. Other usability studies have noted similar issues with the LibGuides product [12–14]. Based on the results of the usability testing, HSL modified its citation management tools guide, adding to the guide's home page a left-hand menu bar containing links to all of the guide's pages; Eastern Michigan University found this to be an effective navigation improvement [13]. As these guides are used system-wide, this finding has implications for the wider library system. Clips of the Camtasia videos were passed along to the U of C LibGuides working group, along with a recommendation that the Research Guides as a whole undergo usability testing. As Sonsteby and Dejonghe [15] suggest, this testing should be done on an ongoing basis.

Our participants' lack of awareness of Research Guides is consistent with what Ouellette [14] discovered at the University of Alberta and MacEwan University; libraries need to do a better job of promoting the guides to create awareness. One concern is that, of the three participants who said they would not go through the library to find citation information, two were faculty members. Because one reason students use Research Guides is that they are directed to them by their instructors [14], promotion should start with faculty members.

The single search box

A web-scale discovery service is a single search box that allows patrons to search the catalogue, freely available web-based content, and full-text of articles from all of the library's subscription databases at the same time [16]. Originally, the HSL librarians believed that the discovery search box was relevant only for general undergraduate programs and should not be included on the HSL branch page. However, because it was a system-wide fixed element it could not be excluded. The usability testing demonstrated the value of the discovery search box to our patrons and changed staff attitudes.
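Conceptually, a discovery service fans a single query out over several kinds of content and merges the results into one ranked list. The Python sketch below is only a schematic of that idea; every function and field name in it is invented, and commercial discovery products typically search a single pre-built central index rather than querying live sources as shown here.

```python
# Schematic of a single discovery search box: one query, several
# content types, one merged relevance-ranked list. All names here
# are illustrative; this is not a real discovery service API.

from concurrent.futures import ThreadPoolExecutor

def search_catalogue(query):   # catalogue records (books, media)
    return [{"title": "...", "source": "catalogue", "score": 0.9}]

def search_open_web(query):    # freely available web-based content
    return [{"title": "...", "source": "web", "score": 0.4}]

def search_fulltext(query):    # subscription database articles
    return [{"title": "...", "source": "full text", "score": 0.8}]

def discovery_search(query):
    """Query every source in parallel and merge by relevance."""
    sources = (search_catalogue, search_open_web, search_fulltext)
    with ThreadPoolExecutor() as pool:
        batches = pool.map(lambda search: search(query), sources)
    merged = [hit for batch in batches for hit in batch]
    return sorted(merged, key=lambda hit: hit["score"], reverse=True)

for hit in discovery_search("knee osteoarthritis exercise"):
    print(hit["source"], hit["score"])
```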

Although the single search box offered a faster route to the full text of articles, and 50% of participants took advantage of it, there was still a lack of understanding of its full potential. Only two participants used the discovery search box to search their topic. One participant said “So that's books and articles…” and did not use it, because they did not associate it with searching a topic.

Library jargon

Kupersmith [5] has been tracking usability studies since 2002 to help reduce the amount of library jargon on library web sites. This study, along with the 51 studies that Kupersmith summarized, confirms that there is still language that our users do not understand. Participants did not know what “Research Guides” meant, and “databases” is still library jargon. One participant, reasoning that articles are found in journals, started with e-journals when researching a topic. Once at the e-journal list, the participant said “not exactly sure if I'm looking for a journal or a journal article”, showing awareness of being in the wrong place, and then asked whether databases would be the correct place to search. This is an ongoing problem with no easy solution.

Familiarity

One key finding for instruction going forward is that users will go to resources they are familiar with; 75% of users started their search with PubMed/MEDLINE and the only other databases mentioned were Google Scholar and the point-of-care tools UpToDate and DynaMed. One participant even referred to PubMed as their “trusted companion”. The preference for PubMed/MEDLINE for searching is consistent with other institutions [17]. Librarians need to ensure there is name awareness of other key databases.

Important information above the fold

It is best practice in web design to keep important content above the fold [18, 19]. Because of this, and because participants had difficulty locating links they had to scroll down to find, all content that HSL wanted to highlight was moved to the top row of boxes under the navigation bar in the final design (Figure 5). For example, HSL wanted to promote its instruction classes, so the HSL News box was swapped with the chat box to make it more visible. Because Quick Links sits below the fold, the number of links under it was reduced by removing the Place an Interlibrary Loan link and moving the Workshops links up to the Research section, replacing the Databases link. We felt confident removing these links because the majority of participants did not use them.

Fig. 5. Final web page design (with changes from pretest highlighted in red).

Study limitations

There are some elements that we would change for the next usability study.

Selection bias may have been introduced by including only library users and friends of library staff in the testing process. Additional information might have been obtained had we been able to recruit nonusers of the library. When the main U of C library conducted its own testing, it advertised an incentive in an effort to recruit more participants.

The article title chosen for the participants to search was fairly distinctive. A future test could include a search for a more common title, or one that returns multiple similar results, to determine whether patrons could still retrieve the full text when it is more complicated to find.

Participants’ ability to navigate to the HSL branch web page was not tested (i.e., users were given the page as their starting point). This was partly because the web site was still in beta mode and was not yet linked to the main library web page. However, after the launch, reference staff commented that discoverability was an issue with the site.

The web site was only tested on a desktop computer. No information was gained about the functionality of the site for mobile users.

The accessibility of the site for disabled patrons was not tested.

Conclusions

The usability testing was easy for librarians to do themselves; the hardest part was participant recruitment. Though the testing revealed that only minimal changes to the original proposed design were necessary (Figure 5), we learned valuable information about how our patrons use our web site, which will help us design our instruction and Research Guides in the future. Usability testing takes a snapshot of users' needs and understanding at a particular point in time. Consequently, we recommend that this type of testing not be done only when a redesign occurs. More frequent testing will ensure that the library's most important marketing and access tool, the web site, is continuously evaluated and updated to incorporate users' changing needs and experiences.

References

1. Dumas JS, Redish JC. A practical guide to usability testing. 2nd ed. Portland (OR): Intellect Ltd; 1999. 404 pp.
2. Krug S. Rocket surgery made easy: the do-it-yourself guide to finding and fixing usability problems. Berkeley (CA): New Riders; 2010. 161 pp.
3. Ipri T, Yunkin M, Brown JM. Usability as a method for assessing discovery. Inf Technol Libr. 2009 December;28(4):181–3. doi: 10.6017/ital.v28i4.3229.
4. Duncan V, Fichter DM. What words and where? Applying usability testing techniques to name a new live reference service. J Med Libr Assoc [Internet]. 2004 April [cited 25 April 2014];92(2):218–25. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC385303/.
5. Kupersmith J. Library terms that users understand [Internet]. UC Berkeley: UC Berkeley Library; 2012 February 29 [cited 25 April 2014]. Available from: http://escholarship.org/uc/item/3qq499w7.
6. Nielsen J, Tahir M. Keep your users in mind. Internet World [Internet]. 2000 December 15 [cited 25 April 2014];6(24):43. Available from Academic OneFile: http://bit.ly/1ikLNsI.
7. Wilson C. Taking usability practitioners to task. Interactions. 2007 January–February;14(1):48–9. doi: 10.1145/1189976.1190004.
8. Letnikova G. Developing a standardized list of questions for the usability testing of an academic library web site. J Web Librariansh. 2008 April;2(2/3):381–415. doi: 10.1080/19322900802205817.
9. Nielsen J. Why you only need to test with 5 users [Internet]. Fremont (CA): Nielsen Norman Group; 2000 March 19 [cited 25 April 2014]. Available from: http://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/.
10. Vermeeren A, van Kesteren I, Bekker T. Managing the “evaluator effect” in user testing. Proceedings of IFIP INTERACT03 Human-Computer Interaction. 2003 September 1–5; Zurich, Switzerland. IFIP Technical Committee No 13 on Human-Computer Interaction; 2003. p. 647–51.
11. Norgaard M, Hornbaek K. What do usability evaluators do in practice? An explorative study of think-aloud testing. Proceedings of the 6th Conference on Designing Interactive Systems. 2006 June 26–28; University Park, PA. New York (NY): ACM; 2006. p. 209–18. doi: 10.1145/1142405.1142439.
12. Corbin J, Karasmanis S. Health sciences information literacy modules usability testing report [Internet]. Bundoora (AU): La Trobe University; 2009 [cited 25 April 2014]. Available from: http://arrow.latrobe.edu.au:8080/vital/access/manager/Repository/latrobe:20690.
13. Pittsley KA, Memmott S. Improving independent student navigation of complex educational web sites: an analysis of two navigation design changes in LibGuides. Inf Technol Libr. 2012;31(3):52–64. doi: 10.6017/ital.v31i3.1880.
14. Ouellette D. Subject guides in academic libraries: A user-centred study of uses and perceptions. Can J Inf Libr Sci. 2011 December;35(4):423–35. doi: 10.1353/ils.2011.0023.
15. Sonsteby A, Dejonghe J. Usability testing, user-centered design, and LibGuides subject guides: A case study. J Web Librariansh. 2013;7(1):83–94. doi: 10.1080/19322909.2013.747366.
16. Popp MP, Dallis D, editors. Planning and implementing resource discovery tools in academic libraries. Hershey (PA): IGI Global; 2012.
17. De Groote SL, Dorsch JL. Measuring use patterns of online journals and databases. J Med Libr Assoc [Internet]. 2003 April [cited 17 April 2014];91(2):231–40. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC153164/.
18. Dickstein R, Mills V. Usability testing at the University of Arizona Library: How to let the users in on the design. Inf Technol Libr [Internet]. 2000 September [cited 27 May 2014];19(3):144–51. Available from: http://ezproxy.lib.ucalgary.ca/login?url=http://search.proquest.com/docview/215830070?accountid=9838.
19. Golombisky K, Hagen R. White space is not your enemy: A beginner's guide to communicating visually through graphic, web and multimedia design. 2nd ed. Saint Louis (MO): Focal Press; 2013.
