Top 5 problems with library websites – a review of recent usability studies

What are the most common UX problems with academic library websites and library tools?  I looked at 16 studies conducted over the past two years, and here is what I learned:

1. What does that mean? Library jargon

This was by far the most-cited problem: 10 out of 16 studies reported problems with library jargon. Not surprising, considering a recent review of library websites that found only 49% of headings, titles, and links to be jargon-free [1]. Problematic terms included:

  • Catalog or discovery tool: “catalog,” “COPAC,” “LINK+,” “Engine Orange”
  • Fulfillment: “Find It @ UIC”, “360Link”, “Get it,” “location”
  • Journal and database terminology: “Databases,” “Periodical,” “Serial”
  • Research links: “Research guides,” “Reference Sources,” “E-shelf,” “Collections”
  • Locations: “Course reserves,” “Reference”

2. What am I searching? Understanding search tools

In 7 of the 16 studies, users did not understand what was included in search tools. A usability study of NCSU’s single search box found that 23% of searches were not for article or book content [2], while at CSU Fresno, users typed database subjects into the catalog search box instead of navigating to the databases page [3]. LibGuides search [4] and site search boxes [5] were also problematic. See also my post on this issue.

3. Where am I? Getting lost in silos

6 studies found usability problems when users were transferred to external sites. Problems included: the link resolver [6], publisher and database sites [7], consortial/ILL catalogs [8], and needing to authenticate too many times [9]. One study summed it up: “library web sites continue to be a compilation of information silos.” [10]

4. What is it? Understanding bibliographic formats and relationships

5 studies found that students had difficulty understanding the relationship between “articles” and “journals” [11]. One study also found students had trouble distinguishing “books” from “book reviews” [12].

5. How do I get it? Difficulty finding full-text

5 studies found that users had difficulty getting to resources. Both students [13] and faculty [14] struggled with finding and navigating links to PDF full-text.  Users also had difficulty finding how to request books not owned by the library [15].

And one bonus problem: 

6. Where is it? Navigating with tabs

4 studies noticed that users often did not see or use tabs in search tools and LibGuides [16]. This could be a problem for the 52% of libraries [17] that use tabbed search boxes!




1. In their review of library websites, Chow, et al. (2014) found that, across the 102 academic sites surveyed, 49% of headings, titles, and links were jargon-free. [back to top]

2.  Lown, Sierra, and Boyer (2013) performed a usability study of the library’s single search box.  They found that “about 23% of use of QuickSearch took place outside of either the catalog or book modules, indicating that NCSU Library users attempt to access a wide range of information from the single search box.” (pg. 240)  [back to top]

3.  In their study of the library’s website, Newell, et al. (2013) noted: “Many of the problems students had with the site during usability testing resulted from their expectation that the search box would do more to help them accomplish the tasks than it did.” (pg. 242) [back to top]

4.  Sonsteby and Dejonghe (2013) performed usability testing of LibGuides and found that “participants wanted a search box and expected it to behave as a discovery tool.” (pg. 86) [back to top]

5. In their review of usability testing on the library website, Brown and Yunkin (2014) found that when prompted to use the library site search, users instead searched the college’s site search: “the UNLV search box was used on multiple occasions. Past usability testing has shown that having multiple search boxes is confusing.” (pg. 39) [back to top]

6. In a study comparing two discovery tools, Djenno, et al. (2014) noted that participants were “confounded by the intermediate page of the link resolver” because it bore “little resemblance to the discovery environment leading to it.” (pg. 276) [back to top]

7.    Imler and Eichelberger (2014) looked at how students navigate commercial database sites to find full-text articles.  They found: “database design was more of a deterrent to task completion than student misunderstanding of library terminology.” (pg. 285).  See also my summary of their study in this post.   [back to top]

8. In their study of a resource sharing catalog, Jones, Pritting & Morgan (2014) found that “when users entered the IDS Search OPAC through the library’s homepage, they became confused by the change in interfaces.”  See also my summary of this article.    [back to top]

9. Bull, Craft, and Dodds (2014) from the University of Birmingham (UK) found that one third of participants complained of an “excessive number of prompts for log-in details” in their discovery system (pg. 154). [back to top]

10. Brown and Yunkin, 2014.   See pg. 43.   [back to top]

11. In discovery systems, students had trouble distinguishing “journals” and “articles.” (Nichols, et al., 2014; Cassidy, et al., 2014). One study found “failure to complete tasks at times reflected a lack of understanding of the differences between material types or how to find an article within a journal” (Djenno, Insua, Gregory, & Brantley, 2014, p. 278). In a health sciences library study, one participant wasn’t sure whether to start with discovery, e-journals, or databases to find an article on a given topic (Lemieux & Powelson, 2014). [back to top]

12. Known-item book searching in a discovery system resulted in confusion because both books and book reviews with similar titles were included in results (Cassidy, Jones, McMain, Shen, & Vieira, 2014, pg. 24). [back to top]

13. Cassidy, et al. (2014) noted that 5 out of 10 students “were unable to locate the full-text” of articles, and in their study of full-text article retrieval, Imler and Eichelberger (2014) learned that only 25% of students could successfully find full-text on commercial database sites. In their study comparing two discovery systems, Djenno, et al. (2014) found that “even if the discovery tool led them to the link resolver correctly, they still did not find the full-text item.” [back to top]

14. DeRidder & Matheny (2014) tested how faculty members use online databases.  They found that 3 out of 11 participants “could not even find a link to the digitized content on the results page or even a clear indication of how to access that content.”   [back to top]

15. After conducting a study that included a task to find a book not owned by the library, Newell, et al. (2013) concluded that ILL services needed to be more prominent on their site.   [back to top]

16. CSU Fresno’s usability study found that their tabbed search box was “not intuitive to users” and “not as successful as was originally hoped” (Newell, et al., 2013). A usability study of the Primo discovery tool found that users had difficulty finding information about record details presented as tabs: “the information provided in these tabs is not easily accessible.” (Nichols, et al., 2014). Both Lemieux and Powelson (2014) and Sonsteby and Dejonghe (2013) found that LibGuides users did not always see or use tabbed navigation. [back to top]

17.  Jones and Thorpe (2014) surveyed 313 medium-sized academic libraries and found that 52.4% “chose to place the discovery service as a tab within a multi-tabbed search box.”   [back to top]

Works Cited

Bauer, K. (2014). Yale University Library report on Articles+ (Summon 2.0) usability testing May 23, 2014. Retrieved from:

Brown, J. M., & Yunkin, M. (2014). Tracking changes: One library’s homepage over time—findings from usability testing and reflections on staffing. Journal Of Web Librarianship, 8(1), 23-47. doi:10.1080/19322909.2014.872972

Bull, S., Craft, E., & Dodds, A. (2014). Evaluation of a resource discovery service: FindIt@Bham. New Review Of Academic Librarianship, 20(2), 137-166. doi:10.1080/13614533.2014.897238

Cassidy, E., Jones, G., McMain, L., Shen, L., & Vieira, S. (2014). Student searching with EBSCO Discovery: A usability study. Journal Of Electronic Resources Librarianship, 26(1), 17-35. doi:10.1080/1941126X.2014.877331

Chow, A., Bridges, M., & Commander, P. (2014). The website design and usability of US academic and public libraries. Reference & User Services Quarterly, 53(3), 253-265.

DeRidder, J. L., & Matheny, K. G. (2014). What do researchers need? Feedback on use of online primary source materials. D-Lib Magazine, 20(7/8), 54-70. doi:10.1045/july2014-deridder

Djenno, M., Insua, G., Gregory, G. M., & Brantley, J. S. (2014). Discovering usability: Comparing two discovery systems at one academic library. Journal Of Web Librarianship, 8(3), 263-285. doi:10.1080/19322909.2014.933690

Imler, B., & Eichelberger, M. (2014). Commercial database design vs. library terminology comprehension: Why do students print abstracts instead of full-text articles?. College & Research Libraries, 75(3), 284-297.

Jones, S.L. & Thorpe, A. (2014) Library homepage design at medium-sized institutions. Journal of Web Librarianship, 8(1), 1-22. doi: 10.1080/19322909.2014.850315

Jones, W. E., Pritting, S., & Morgan, B. (2014). Understanding availability: Usability testing of a consortial interlibrary loan catalog. Journal Of Web Librarianship, 8(1), 69-87. doi:10.1080/19322909.2014.872967

Lemieux, M., & Powelson, S. (2014). Results of a usability study to test the redesign of the Health Sciences Library web page. Journal Of The Canadian Health Libraries Association (JCHLA), 32(2), 49-54. doi:10.5596/c14-023

Lown, C., Sierra, T., & Boyer, J. (2013). How users search the library from a single search box. College & Research Libraries, 74(3), 227-241.

MIT Libraries (2014). 2014 – Homepage search options. Retrieved from:

Newell, P.A., Delcore, H.D, Dinscore, A., Cowgill, A., & McClung, J. (2013). Collaborating for change: Leveraging campus partnerships to create a user-centered library website. Internet Reference Services Quarterly, 18(3-4), 227-246. doi: 10.1080/10875301.2013.852648

Nichols, A., Billey, A., Spitzform, P., Stokes, A., & Tran, C. (2014). Kicking the tires: A usability study of the Primo discovery tool. Journal Of Web Librarianship, 8(2), 172-195. doi:10.1080/19322909.2014.903133

Niu, X., Zhang, T., & Chen, H. (2014). Study of user search activities with two discovery tools at an academic library. International Journal of Human-Computer Interaction, 30(5), 422-433. doi:10.1080/10447318.2013.873281

Sonsteby, A., & Dejonghe, J. (2013). Usability testing, user-centered design, and LibGuides subject guides: A case study. Journal Of Web Librarianship, 7(1), 83-94. doi:10.1080/19322909.2013.747366







8 thoughts on “Top 5 problems with library websites – a review of recent usability studies”

  1. Your first footnote references Chow’s article, but I am unable to find any reference to jargon or the 49% statistic, and the survey in the cited article was conducted on 1,469 sites and not 102.

    Did you mean to cite a different article?

    1. Hi Stephen,

      Thanks for your comment! That is the right article, but it is a bit cryptic, I agree. Chow et al. conducted both a survey (1266 libraries responded) and an analysis of websites. For the analysis, the authors measured library websites (102 of which were academic) using a “Usability Checklist.” The results for that include the 49% statistic, see table 3 on page 259: “Are headings, title, and links jargon-free?” Public libraries did much better in this area, interestingly. I hope this helps! Please let me know if you have more questions. Best, Emily
