Eyetracking Study of the Kent State Library
The Objective
Interpreting an eyetracking study of the Kent State library website was one of my projects for my Information Architecture Master's program. The goal was to determine how easily users could request that the library purchase a book it did not have, and to identify where on the screen they looked for that function.
Method
In the test, a moderator gave five participants (students in their 20s) a scenario in which the Kent State library did not have their favorite book. They were then asked to use the library website to suggest that the library purchase the book. Eyetracking software tracked the participants' gazes, and they were filmed so we could record their comments, movements, and facial expressions. The moderator was careful not to interrupt, as interruptions can divert a participant's focus away from the screen.
Results
Only one of the five participants was able to complete the task, and even that participant experienced great difficulty. Most users performed an exhaustive review of the Services, About Us, and Help sections, as well as the left navigation of interior pages, which may indicate that these pages are cluttered or poorly organized. Several miscues derailed users, such as a tab called "Books & More," an "Ask Us" link, and a "Quick jump to" dropdown. Only two users scrolled to the footer where the function was actually located; footers are low-priority real estate in web design, so few users think to look there. The library site has since been redesigned to be more user-friendly. Check out detailed results for each participant in the full report below.
Lesson Learned
There were a couple of problems with the way this study was conducted. First, in many of the tests, the user freely scanned the page before knowing the task. The moderator should have given the task before launching the page, which would have produced more reliable gaze plots.
Second, there was a high rate of task abandonment, which suggests that users were not emotionally invested in the task. Even participant 4 stated that she had never tried to do this before. In their book Eyetracking Web Usability, Jakob Nielsen and Kara Pernice state that unrealistic tasks create unrealistic eyetracking results. In future eyetracking studies, I would recommend letting participants play a role in choosing their tasks so they are more likely to identify with them and attempt to complete them.
Read more about my thoughts on eyetracking on my blog.